From 90d1290522c59813825b78ceaebe313a9f286b5f Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Thu, 16 Nov 2023 09:50:41 -0500 Subject: [PATCH 01/30] Update requirements.txt Updated == to >=, included crds --- notebooks/STIS/custom_ccd_darks/requirements.txt | 13 +++++++------ 1 file changed, 7 insertions(+), 6 deletions(-) diff --git a/notebooks/STIS/custom_ccd_darks/requirements.txt b/notebooks/STIS/custom_ccd_darks/requirements.txt index f3ec2de4e..7fd21a7aa 100644 --- a/notebooks/STIS/custom_ccd_darks/requirements.txt +++ b/notebooks/STIS/custom_ccd_darks/requirements.txt @@ -1,6 +1,7 @@ -astropy==5.2.1 -astroquery==0.4.6 -matplotlib==3.7.0 -numpy==1.23.4 -refstis==0.6.1 -stistools==1.4.4 +astropy>=5.2.1 +astroquery>=0.4.6 +matplotlib>=3.7.0 +numpy>=1.23.4 +refstis>=0.6.1 +stistools>=1.4.4 +crds>=11.17 From 058cae755b3746fa78eb162dfa5282b57b6f16e7 Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Fri, 17 Nov 2023 14:31:36 -0500 Subject: [PATCH 02/30] Update requirements.txt Updated requirements versions and added in scikit-image to fix notebook execution error. --- notebooks/ACS/acs_findsat_mrt/requirements.txt | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/notebooks/ACS/acs_findsat_mrt/requirements.txt b/notebooks/ACS/acs_findsat_mrt/requirements.txt index c9925552e..f01f1a804 100644 --- a/notebooks/ACS/acs_findsat_mrt/requirements.txt +++ b/notebooks/ACS/acs_findsat_mrt/requirements.txt @@ -1,6 +1,6 @@ -acstools==3.6.1 -astropy==5.2.1 +acstools==3.7.0 +astropy==5.3.4 astroquery==0.4.6 -matplotlib==3.7.0 -numpy==1.23.4 -photutils==1.9.0 \ No newline at end of file +matplotlib==3.8.1 +numpy==1.26.2 +scikit-image==0.22.0 From ee81fa59f8a778183d9e9ac1ea0d34de03be1209 Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Fri, 17 Nov 2023 14:36:57 -0500 Subject: [PATCH 03/30] Update requirements.txt Added photutils for notebook execution --- notebooks/ACS/acs_findsat_mrt/requirements.txt | 1 + 1 file changed, 1 insertion(+) diff --git a/notebooks/ACS/acs_findsat_mrt/requirements.txt b/notebooks/ACS/acs_findsat_mrt/requirements.txt index f01f1a804..132c59710 100644 --- a/notebooks/ACS/acs_findsat_mrt/requirements.txt +++ b/notebooks/ACS/acs_findsat_mrt/requirements.txt @@ -4,3 +4,4 @@ astroquery==0.4.6 matplotlib==3.8.1 numpy==1.26.2 scikit-image==0.22.0 +photutils==1.9.0 From 31cbf7caa88b908c429032795369cf43881d0ab1 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Fri, 17 Nov 2023 16:09:01 -0500 Subject: [PATCH 04/30] Add WFC3 notebook 'calwf3_with_v1.0_PCTE.ipynb' (#100) * Added notebook-level requirements.txt file. * calwf3_with_v1.0_PCTE.ipynb: cleared notebook outputs * background_median.py: added trailing space * _config.yml: removed calwf3_with_v1.0_PCTE.ipynb notebook from exclude_list * _toc.yml: uncommented line pointing to calwf3_with_v1.0_PCTE.ipynb notebook. * Create pre-requirements.sh This notebook relies on an old version of `hstcal` and therefore an old version of `calwf3`. Every `hstcal` version > 2.5.0 and every corresponding `calwf3` version > 3.5.2 will employ the v2.0 PCTE correction. Since this notebook's purpose is to demonstrate how to use the v1.0 PCTE correction, we must ensure `hstcal` version 2.5.0 is installed and used. I don't believe this version (2.5.0) is offered on conda-forge. 
* Update README.md changing environment creation command to use the `requirements.txt` file * Update background_median.py PEP8 compliance * Update requirements.txt pinning the packages to the versions in my environment where the notebook runs successfully * Update calwf3_with_v1.0_PCTE.ipynb mostly minor edits to update the notebook for being hosted in hst_notebooks repo * Update calwf3_with_v1.0_PCTE.ipynb updates for PEP8 compliance * Update background_median.py adding blank lines for PEP8 compliance * Update calwf3_with_v1.0_PCTE.ipynb more PEP8 compliance changes * Update README.md slight change to the workflow for creating the virtual environment * Update requirements.txt removing version pins and adding `crds` * Update calwf3_with_v1.0_PCTE.ipynb adding `crds bestref` to pull ref files and `crds sync` to pull necessary archived files * Update calwf3_with_v1.0_PCTE.ipynb remove astropy sigma clip stats since it isn't used * Update calwf3_with_v1.0_PCTE.ipynb made better variable names and added logic to check if the FLT file already exists --------- Co-authored-by: bjkuhn --- _config.yml | 1 - _toc.yml | 2 +- notebooks/WFC3/calwf3_v1.0_cte/README.md | 13 +- .../calwf3_with_v1.0_PCTE.ipynb | 500 +++++++++++------- .../example/background_median.py | 16 +- .../WFC3/calwf3_v1.0_cte/pre-requirements.sh | 1 + .../WFC3/calwf3_v1.0_cte/requirements.txt | 9 + 7 files changed, 324 insertions(+), 218 deletions(-) create mode 100644 notebooks/WFC3/calwf3_v1.0_cte/pre-requirements.sh create mode 100644 notebooks/WFC3/calwf3_v1.0_cte/requirements.txt diff --git a/_config.yml b/_config.yml index acd78947c..0ec637038 100644 --- a/_config.yml +++ b/_config.yml @@ -45,7 +45,6 @@ exclude_patterns: [notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb, notebooks/DrizzlePac/sky_matching/sky_matching.ipynb, notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb, notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb, - notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb, notebooks/WFC3/dash/dash.ipynb, notebooks/WFC3/exception_report/wfc3_exception_report.ipynb, notebooks/WFC3/filter_transformations/filter_transformations.ipynb, diff --git a/_toc.yml b/_toc.yml index 33c90247b..3703fb2aa 100644 --- a/_toc.yml +++ b/_toc.yml @@ -60,7 +60,7 @@ parts: chapters: - file: notebooks/WFC3/README.md # - file: notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb -# - file: notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb + - file: notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb # - file: notebooks/WFC3/dash/dash.ipynb # - file: notebooks/WFC3/exception_report/wfc3_exception_report.ipynb # - file: notebooks/WFC3/filter_transformations/filter_transformations.ipynb diff --git a/notebooks/WFC3/calwf3_v1.0_cte/README.md b/notebooks/WFC3/calwf3_v1.0_cte/README.md index 0e3469566..cab8df072 100755 --- a/notebooks/WFC3/calwf3_v1.0_cte/README.md +++ b/notebooks/WFC3/calwf3_v1.0_cte/README.md @@ -1,19 +1,20 @@ This directory, once cloned from the repository, should contain this `README.md`, the Jupyter Notebook `calwf3_with_v1.0_PCTE.ipynb`, a file -called `archived_drkcfiles.txt` and a subdirectory `example/`. +called `archived_drkcfiles.txt`, a `requirements.txt` file, and a subdirectory `example/`. -**In order to run this Jupyter Notebook you must have created a virtual +**To run this Jupyter Notebook you must have created a virtual conda environment that includes `calwf3` v3.5.2.** Version 3.5.2 of `calwf3` -is available in HSTCAL release 2.5.0. 
To create an environment with +is available in `hstcal` release 2.5.0. To create an environment with `calwf3` v3.5.2 try this from the terminal: ``` $ conda config --add channels http://ssb.stsci.edu/astroconda - -$ conda create -n v1_PCTE hstcal==2.5.0 python=3.7 ginga stsci-hst notebook +$ conda create -n v1_PCTE hstcal==2.5.0 python=3.11 +$ conda activate v1_PCTE +$ pip install -r requirements.txt ``` -In general, users wanting to use the v1.0 pixel-based CTE correction +In most cases, users wanting to use the v1.0 pixel-based CTE correction within `calwf3` should use `calwf3` v3.5.2. This version will provide the most up-to-date calibration procedures such as time-dependent photometric corrections and zeropoints, while also including the v1.0 correction. diff --git a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb index aafb0f594..efc08eb9e 100644 --- a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb +++ b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb @@ -20,24 +20,23 @@ "- Run `calwf3` `v3.5.2` to calibrate the raw image with the v1.0 pixel based CTE-correction.\n", "- Compare v1.0 and v2.0 products. \n", "\n", - "\n", - "Please make sure you have read the `README.md` file before continuing.\n", - "\n", "## Table of Contents\n", " [Introduction](#intro)
\n", " \n", " [1. Imports](#imports)
\n", - " [2. Verify `archived_drkcfiles.txt` is in CWD](#txtfile)
\n", - " [3. Check that `calwf3` Version is `v3.5.2`](#checkcalver)
\n", + " [2. Verify `archived_drkcfiles.txt` is in CWD](#txtfile)
\n", + " [3. Check that `calwf3` Version is `v3.5.2`](#checkcalver)
\n", " [4. Query MAST and Download a WFC3 `raw.fits` Image](#download)
\n", - "       [4.1 Inspect Image Headers](#imageheaders)
\n", - " [5. Find the Correct `DRKCFILE`](#finddark)
\n", - " [6 Modify Image Header](#modhead)
\n", - "       [6.1 `DRKCFILE`](#drkcfile)
\n", - "       [6.2 `PCTETAB`](#pctetab)
\n", - "[7. Re-Inspect Image Header](#reinspect)
\n", - "[8. Set Environment Variables](#envvar)
\n", - "[9. Run `calwf3`](#runcal)
\n", + " [5. Set CRDS Environment Variable and Download Reference Files](#env_var)
\n", + "       [5.1 Run `crds bestrefs`](#bestrefs)
\n", + "       [5.2 Inspect Image Header](#imageheaders)
\n", + " [6. Find the Correct v1.0 DRKCFILE](#finddark)
\n", + "       [6.1 Download the Correct DRKCFILE from CRDS](#sync_dkc)
\n", + "       [6.2 Modify the Image Header Keyword, `DRKCFILE`](#mod_dkc)
\n", + " [7. Download the v1.0 PCTETAB](#sync_pctetab)
\n", + "       [7.1 Modify the Image Header Keyword `PCTETAB`](#mod_pcte)
\n", + " [8. Re-Inspect Image Header](#reinspect)
\n", + " [9. Run `calwf3`](#runcal)
\n", "[10. Inspect `FLC` Image Header](#inspectflc)
\n", "[11. Investigate v1.0 and v2.0 Differences](#verdiffs)
\n", "      [11.1 Download the v2.0 FLC File](#downloadv2.0)
\n", @@ -50,7 +49,7 @@ "\n", "[Additional Resources](#resources)
\n", "[About the Notebook](#about)
\n", - "[Citations](#cite) \n" + "[Citations](#cite) " ] }, { @@ -61,11 +60,10 @@ "\n", "## Introduction \n", "\n", - "The v1.0 pixel-based Charge Transfer Efficiency (CTE) correction was first implemented into `calwf3` `v3.3` in 2016 ([Ryan et al. 2016](https://ui.adsabs.harvard.edu/abs/2016wfc..rept....1R/abstract),
[Anderson & Bedin 2010](https://ui.adsabs.harvard.edu/abs/2010PASP..122.1035A/abstract), [HSTCAL release notes](https://github.com/spacetelescope/hstcal/releases/tag/1.0.0)). This also marked the first time users could directly download CTE-corrected `flc & drc`
files from [MAST](https://mast.stsci.edu/search/hst/ui/#/). While the v1.0 correction was sufficient for many years, the degradation of CTE over time reduced the efficacy of the model
in treating low-level pixels. The v1.0 correction adversely impacts (overcorrects) both the image background and faint sources. In April 2021
the v2.0 pixel-based CTE correction was implemented in `calwf3` `v3.6.0` ([Anderson et al. 2021](https://ui.adsabs.harvard.edu/abs/2021wfc..rept....9A/abstract), [Kuhn & Anderson 2021](https://ui.adsabs.harvard.edu/abs/2021wfc..rept....6K/abstract), [HSTCAL release
notes](https://github.com/spacetelescope/hstcal/releases/tag/2.7.0)). Since MAST uses the latest release of `calwf3` for calibration, any WFC3/UVIS CTE corrected data retrieved from MAST, regardless of
observation date, will be calibrated with the v2.0 pixel-based CTE correction. Although v1.0 pixel-based CTE-corrected `flc & drc` files are
no longer accessable through MAST, this notebook steps through the procedure required to calibrate WFC3/UVIS images using the v1.0 CTE
correction.
\n", + "The v1.0 pixel-based Charge Transfer Efficiency (CTE) correction was first implemented into `calwf3` `v3.3` in 2016
([Ryan et al. 2016](https://ui.adsabs.harvard.edu/abs/2016wfc..rept....1R/abstract), [Anderson & Bedin 2010](https://ui.adsabs.harvard.edu/abs/2010PASP..122.1035A/abstract), [HSTCAL release notes](https://github.com/spacetelescope/hstcal/releases/tag/1.0.0)). This also marked the first time users could directly
download CTE-corrected `flc & drc` files from [MAST](https://mast.stsci.edu/search/hst/ui/#/). While the v1.0 correction was sufficient for many years, the
degradation of CTE over time reduced the efficacy of the model in treating low-level pixels. The v1.0 correction adversely
impacts (overcorrects) both the image background and faint sources. In April 2021 the v2.0 pixel-based CTE correction
was implemented in `calwf3` `v3.6.0` ([Anderson et al. 2021](https://ui.adsabs.harvard.edu/abs/2021wfc..rept....9A/abstract), [Kuhn & Anderson 2021](https://ui.adsabs.harvard.edu/abs/2021wfc..rept....6K/abstract), [HSTCAL release notes](https://github.com/spacetelescope/hstcal/releases/tag/2.7.0)). Since
MAST uses the latest release of `calwf3` for calibration, any WFC3/UVIS CTE corrected data retrieved from MAST,
regardless of observation date, will be calibrated with the v2.0 pixel-based CTE correction. Although v1.0 pixel-based
CTE-corrected `flc & drc` files are no longer accessible through MAST, this notebook steps through the procedure
required to calibrate WFC3/UVIS images using the v1.0 CTE correction.
\n", "\n", "\n", - "One of the limiting factors of using the v1.0 CTE correction are the CTE corrected dark current reference files (`DRKCFILE`). These dark
reference files are delivered to MAST by the WFC3 team and use the same pixel-based CTE correction within `calwf3`. Now that we
have switched to the v2.0 CTE correction there is a cut off for dark current reference files that use the v1.0 correction. Observations taken
after February 2021 will not have\n", - "CTE corrected dark files using the v1.0 algorithm, which means **applying the v1.0 CTE correction works
best for observations taken between May 2009 - February 2021.** If the observation being calibrated was taken after February 2021 there
are two options: **1)** use the last v1.0 CTE corrected dark reference file from February 2021 or **2)** use the v2.0 CTE corrected dark with the
most appropriate `USEAFTER` for the science exposure's observation date. "
    "Among the limiting factors of using the v1.0 CTE correction are the CTE corrected dark current reference files (`DRKCFILE`).
These dark reference files are delivered to MAST by the WFC3 team and use the same pixel-based CTE correction within
`calwf3`. Now that we have switched to the v2.0 CTE correction, there is a cutoff for dark current reference files that use
the v1.0 correction. Observations taken after February 2021 will not have CTE corrected dark files using the v1.0 algorithm,
which means **applying the v1.0 CTE correction works best for observations taken between May 2009 and February 2021.**
If the observation being calibrated was taken after February 2021 there are two options: **1)** use the last v1.0 CTE corrected
dark reference file from February 2021 or **2)** use the v2.0 CTE corrected dark with the most appropriate `USEAFTER` for the
science exposure's observation date. " ] }, { @@ -75,27 +73,25 @@ "\n", "## 1. Imports\n", "\n", - "This notebook assumes you have created the virtual environment in [WFC3 Library's](https://github.com/spacetelescope/WFC3Library) installation instructions.\n", + "
This notebook assumes you have created and activated a virtual environment using the requirements file in this notebook's repository. Please make sure you have read the README file before continuing.
\n", "\n", "We import:
\n", - "
\n", - "**•** *glob* for creating list of files
\n", - "**•** *matplotlib.pyplot* for plotting and displaying images
\n", - "**•** *numpy* for finding indices and concatenating arrays
\n", - "**•** *os* for setting environment variables
\n", - "**•** *shutil* removing an empty directory
\n", "\n", - "**•** *astropy.io.fits* for opening and modifying fits files
\n", - "**•** *astroquery.mast.Observations* for downloading data from MAST
\n", - "**•** *astropy.table.Table* for creating and manipulating data tables
\n", - "**•** *astropy.time.Time* for converting between time formats
\n", - "**•** *ginga.util.zscale* for finding scale limits when displaying images
\n", - "**•** *photutils.aperture.aperture_photometry* for performing aperture photometry
\n", - "**•** *photutils.aperture.CircularAperture* for creating circular apertures
\n", - "**•** *photutils.aperture.CircularAnnulus* for creating circular annuli
\n", - "**•** *wfc3tools.calwf3* for verifying the version and running pipeline
\n", - "\n", - "**•** *background_median.aperture_stats_tbl* for measuring background values within annuli
" + "| Package Name | Purpose |\n", + "|:-----------------------------------------|:-------------------------------------------------------|\n", + "| `glob` | creating list of files |\n", + "| `os` | directory maintenance and setting environment variables|\n", + "| `astropy.io.fits` | opening and modifying fits files |\n", + "| `astroquery.mast.Observations` | downloading data from MAST |\n", + "| `astropy.table.Table` | creating and manipulating data tables |\n", + "| `astropy.visualization.ZScaleInterval` | finding z-scale limits when displaying images |\n", + "| `matplotlib.pyplot` | plotting and displaying images |\n", + "| `numpy` | finding indices and concatenating arrays |\n", + "| `photutils.aperture.aperture_photometry` | performing aperture photometry |\n", + "| `photutils.aperture.CircularAperture` | creating circular apertures |\n", + "| `photutils.aperture.CircularAnnulus` | creating circular annuli |\n", + "| `wfc3tools.calwf3` | verifying the version and running pipeline |\n", + "| `background_median.aperture_stats_tbl` | measuring background values within annuli |" ] }, { @@ -105,16 +101,14 @@ "outputs": [], "source": [ "import glob\n", - "import numpy as np\n", - "import matplotlib.pyplot as plt\n", "import os\n", - "import shutil\n", "\n", "from astropy.io import fits\n", - "from astroquery.mast import Observations\n", "from astropy.table import Table\n", - "from astropy.time import Time\n", - "from ginga.util import zscale\n", + "from astroquery.mast import Observations\n", + "from astropy.visualization import ZScaleInterval\n", + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", "from photutils.aperture import aperture_photometry, CircularAperture, CircularAnnulus\n", "from wfc3tools import calwf3\n", "\n", @@ -127,9 +121,9 @@ "source": [ "\n", "## 2. Verify `archived_drkcfiles.txt` is in CWD\n", - "When you cloned/downloaded this notebook from [WFC3 Library](https://github.com/spacetelescope/WFC3Library), a .txt file should have been included. The file name is
`archived_drkcfiles.txt` and it is used later on in the notebook. This .txt file includes the file name, delivery date, activation
date, and USEAFTER date for every v1.0 CTE corrected dark reference file between May 2009 - February 2021. Below, we will use
this file in conjunction with the observation date of the file(s) being calibrated to pick out the most appropriate v1.0 CTE corrected
dark reference file(s).
\n", + "When you cloned/downloaded this notebook from [hst_notebooks](https://github.com/spacetelescope/hst_notebooks/), a .txt file should have been included. The file name is
`archived_drkcfiles.txt` and it is used later in the notebook. This .txt file includes the file name, delivery date, activation
date, and USEAFTER date for every v1.0 CTE corrected dark reference file between May 2009 and February 2021. Below, we will use
this file in conjunction with the observation date of the file(s) being calibrated to pick out the most appropriate v1.0 CTE corrected
dark reference file(s).
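Once the file is in place, it can be previewed with the same commented-header reader that Section 6 uses. A minimal, optional sketch (the column names echoed here are the ones the later cells index):

```python
from astropy.table import Table

# Preview the archived DRKCFILE list; Section 6 reads it with this same call
drkc_preview = Table.read('archived_drkcfiles.txt', format='ascii.commented_header')
print(drkc_preview.colnames)  # expect e.g. 'drkcfile', 'useafter', 'useafter-mjd', 'activation-date'
drkc_preview[:5].pprint()
```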
\n", "\n", - "Please make sure the file is in the current working directory before continuing." + "Please make sure the `archived_drkcfiles.txt` file is in the current working directory before continuing." ] }, { @@ -139,7 +133,7 @@ "outputs": [], "source": [ "# list cwd to verify txt file is there\n", - "!ls -l archived_drkcfiles.txt\n" + "!ls -l archived_drkcfiles.txt" ] }, { @@ -149,10 +143,12 @@ "\n", "## 3. Check that `calwf3` Version is `v3.5.2`\n", "In April 2021, a new `calwf3` version was released that contains the v2.0 CTE-correction.\n", - "If you would like to use the v1.0 correction, your
current environment must be using `calwf3` versions equal to or between `3.3` - `3.5.2`. However, in order to get the best v1.0 calibrated
images we must use `calwf3` `v3.5.2`. This version of `calwf3` includes the recent ([~Jan 2021](https://github.com/spacetelescope/hstcal/releases/tag/2.5.0)) update that added MJD as a parameterized
variable for the `PHOTMODE` keyword, which enables a time-dependent photometric correction and zeropoint. If your version is `3.6.0` or higher,
you must downgrade the [`hstcal` package](https://github.com/spacetelescope/hstcal). The safer option, however, is to create a new environment such as:
\n", + "If you would like to use the v1.0 correction, your
current environment must use a `calwf3` version between `3.3` and `3.5.2` (inclusive). However, in order to get the best v1.0 calibrated
images we must use `calwf3` `v3.5.2`. This version of `calwf3` includes the recent ([~Jan 2021](https://github.com/spacetelescope/hstcal/releases/tag/2.5.0)) update that added MJD as a parameterized
variable for the `PHOTMODE` keyword, which enables a time-dependent photometric correction and zeropoint. If your version is `3.6.0` or higher,
you must downgrade the [`hstcal` package](https://github.com/spacetelescope/hstcal). The safer option, however, is to create a new environment using the requirements file provided in
the [notebook's repository](https://github.com/spacetelescope/hst_notebooks/tree/main/notebooks/WFC3/calwf3_v1.0_cte):
\n", "\n", "* `$ conda config --add channels http://ssb.stsci.edu/astroconda`\n", - "* `$ conda create -n v1_PCTE hstcal==2.5.0 python=3.7 ginga stsci-hst notebook`\n", + "* `$ conda create -n v1_PCTE hstcal==2.5.0 python=3.11`\n", + "* `$ conda activate v1_PCTE`\n", + "* `$ pip install -r requirements.txt`\n", "\n", "`hstcal` `v2.5.0` provides version `3.5.2` of `calwf3`, which is the last version that offers the v1.0 pixel-based CTE correction.\n" ] @@ -189,7 +185,7 @@ "source": [ "# Edit this cell's first line if you would to download your own file(s)\n", "# Get the observation records\n", - "obs_table = Observations.query_criteria(obs_id='idv404axq*',proposal_id=15576)\n", + "obs_table = Observations.query_criteria(obs_id='idv404axq*', proposal_id=15576)\n", "\n", "# Get the listing of data products\n", "products = Observations.get_product_list(obs_table)\n", @@ -207,8 +203,7 @@ " os.rmdir('mastDownload/HST/'+filename[:9])\n", " \n", "os.rmdir('mastDownload/HST/')\n", - "os.rmdir('mastDownload/')\n", - " " + "os.rmdir('mastDownload/')" ] }, { @@ -218,7 +213,65 @@ "outputs": [], "source": [ "# show list of current dir to verify fits file is there\n", - "!ls -l *raw.fits\n" + "!ls -l *raw.fits" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## 5. Set CRDS Environment Variable and Download Reference Files" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
If you already have the absolute paths set for CRDS, please skip the code cell immediately below and proceed to the crds bestrefs command in Section 5.1.
\n",
    "\n",
    "Before we run `crds bestrefs` and `calwf3`, we need to [set environment variables](https://hst-crds.stsci.edu/docs/cmdline_bestrefs/) for several subsequent calibration tasks. We will point to a
subdirectory within the main `crds_cache/` using the `IREF` environment variable. The `IREF` variable is used for WFC3 reference files. Other
instruments use other variables, e.g., `JREF` for ACS. You have the option to permanently add these environment variables to your user profile by
adding the path in your shell's configuration file. If you're using bash, you would edit the `~/.bash_profile` file with lines such as:\n", + "\n", + "* `export CRDS_SERVER_URL=\"https://hst-crds.stsci.edu\"`\n", + "* `export CRDS_SERVER=\"https://hst-crds.stsci.edu\"`\n", + "* `export CRDS_PATH=\"$HOME/crds_cache\"`\n", + "* `export iref=\"${CRDS_PATH}/references/hst/wfc3/\"`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", + "os.environ['CRDS_SERVER'] = 'https://hst-crds.stsci.edu'\n", + "os.environ['CRDS_PATH'] = 'crds_cache'\n", + "os.environ['iref'] = 'crds_cache/references/hst/wfc3/'" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## 5.1 Run `crds bestrefs` \n", + "\n", + "The cell below calls [CRDS bestref](https://hst-crds.stsci.edu/static/users_guide/basic_use.html), which will copy the necessary reference files from CRDS over to your local machine, if you do not already have
them. Without running this command we would not be able to calibrate the image with `calwf3`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "bestref_input = 'crds bestrefs --update-bestrefs --sync-references=1 --files idv404axq_raw.fits'\n", + "run_bestref = os.system(bestref_input)\n", + "if run_bestref != 0:\n", + " print(f\"bestref failed with exit code: {run_bestref}\")" ] }, { @@ -226,7 +279,7 @@ "metadata": {}, "source": [ "\n", - "## 4.1 Inspect Image Header\n", + "## 5.2 Inspect Image Header\n", "When processing a raw file through `calwf3` , the pipeline uses a few different header keywords to initiate and run the pixel-based CTE
correction. Here, we inspect the important header keywords from the raw file just downloaded.
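For a one-off spot check, `astropy.io.fits.getval` can read each keyword straight from the primary header; a minimal sketch, equivalent to the summary table the next cell builds:

```python
from astropy.io import fits

# Spot-check the CTE-related keywords on the freshly downloaded raw file
for key in ('PCTETAB', 'DRKCFILE', 'PCTECORR'):
    print(f"{key:8} = {fits.getval('idv404axq_raw.fits', key)}")
```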
\n", "At this step, you should see:
\n", "- `pctetab` set to `iref$54l1347ei_cte.fits`
\n", @@ -241,7 +294,7 @@ "outputs": [], "source": [ "# Collect header keyword info from raw file\n", - "file, date, expstart, pctetab, drkcfile, pctecorr = [],[],[],[],[],[]\n", + "file, date, expstart, pctetab, drkcfile, pctecorr = [], [], [], [], [], []\n", "for f in glob.glob('*raw.fits'):\n", " h = fits.getheader(f)\n", " file.append(h['filename'])\n", @@ -252,7 +305,7 @@ " pctecorr.append(h['pctecorr'])\n", "\n", "image_table = Table([file, date, expstart, pctetab, drkcfile, pctecorr],\n", - " names=('file','date-obs', 'expstart', 'pctetab', 'drkcfile', 'pctecorr'))\n", + " names=('file', 'date-obs', 'expstart', 'pctetab', 'drkcfile', 'pctecorr'))\n", "image_table['expstart'].format = '5.6f'\n", "\n", "# Sort and display the table\n", @@ -265,7 +318,7 @@ "metadata": {}, "source": [ "\n", - "## 5. Find the Correct `DRKCFILE` \n", + "## 6. Find the Correct v1.0 `DRKCFILE` \n", "Below, we open the .txt file containing a list of all the `DRKCFILE` reference files created with the v1.0 pixel-based CTE correction.
`DRKCFILE` reference files are CTE corrected files used by the pipeline to perform the dark current subtraction during the generation
of the `flc` file. The `DRKCFILE` files listed in `archived_drkcfiles.txt` have been archived on the CRDS database and while
they are still accessible for use and download, they are not being actively used by MAST.
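A quick way to see which dark the raw file currently references, and whether that file is already cached locally, is to resolve the `iref$` prefix against the `iref` environment variable set in Section 5 (a sketch, not a required step):

```python
import os
from astropy.io import fits

# Resolve the 'iref$' prefix to a path inside the local CRDS cache
drk = fits.getval('idv404axq_raw.fits', 'DRKCFILE')            # e.g. 'iref$4ac1921pi_dkc.fits'
local_path = os.path.join(os.environ['iref'], drk.split('$')[-1])
print(drk, '->', local_path, '| cached:', os.path.exists(local_path))
```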
\n",
    "\n",
    "In the first cell, we generate an `astropy.Table` (`drkc_table`) using the data from the file `archived_drkcfiles.txt`, mentioned
in [Section 2](#txtfile), and create empty lists for the final table. Then, in the second cell, we index `drkc_table` for the best `DRKCFILE`
that corresponds to the `DATE-OBS` of the `raw` file being calibrated. Lastly, in the third cell, we create and display the final `astropy.Table`
that contains just the necessary `DRKCFILE`.
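If `archived_drkcfiles.txt` happens to be sorted by `useafter-mjd` (an assumption; the cells below do not rely on it), the "latest `USEAFTER` at or before `EXPSTART`" rule collapses to a single `np.searchsorted` call:

```python
import numpy as np


def pick_v1_drkcfile(useafter_mjd, expstart):
    """Index of the latest v1.0 DRKCFILE with USEAFTER <= EXPSTART.

    Assumes `useafter_mjd` is sorted in ascending order (MJD).
    """
    idx = np.searchsorted(useafter_mjd, expstart, side='right') - 1
    if idx < 0:
        raise ValueError('exposure predates the first v1.0 DRKCFILE')
    return idx

# usage sketch: drkc_table['drkcfile'][pick_v1_drkcfile(drkc_table['useafter-mjd'], expstart)]
```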
\n", @@ -279,10 +332,10 @@ "outputs": [], "source": [ "# Generate astropy table from `archived_drkcfiles.txt`\n", - "drkc_table = Table.read('archived_drkcfiles.txt',format='ascii.commented_header')\n", + "drkc_table = Table.read('archived_drkcfiles.txt', format='ascii.commented_header')\n", "\n", "# Create empty lists for final astropy table\n", - "rawfiles, obsdates, dkcfiles, uafters, active_dates = [],[],[],[], []\n" + "rawfiles, obsdates, dkcfiles, uafters, active_dates = [], [], [], [], []" ] }, { @@ -295,18 +348,17 @@ "for expstart in image_table['expstart']:\n", " table_idx = np.where(abs(drkc_table['useafter-mjd']-expstart) == abs(drkc_table['useafter-mjd']-expstart).min())[0][0]\n", "\n", - " rawfile = image_table[image_table['expstart']==expstart]['file'][0]\n", + " rawfile = image_table[image_table['expstart'] == expstart]['file'][0]\n", " \n", " # if drkcfile has useafter date > rawfile expstart use previous drkcfile\n", - " if drkc_table[table_idx]['useafter-mjd'] > image_table[image_table['file']==rawfile]['expstart'][0]:\n", + " if drkc_table[table_idx]['useafter-mjd'] > image_table[image_table['file'] == rawfile]['expstart'][0]:\n", " table_idx -= 1\n", - " #append info\n", + " # append info\n", " rawfiles.append(rawfile)\n", - " obsdates.append(image_table[image_table['file']==rawfile]['date-obs'][0])\n", + " obsdates.append(image_table[image_table['file'] == rawfile]['date-obs'][0])\n", " dkcfiles.append(drkc_table[table_idx]['drkcfile'])\n", " uafters.append(drkc_table[table_idx]['useafter'])\n", - " active_dates.append(drkc_table[table_idx]['activation-date'])\n", - " " + " active_dates.append(drkc_table[table_idx]['activation-date'])" ] }, { @@ -317,22 +369,40 @@ "source": [ "# Generate table of filename, date-obs, drkc-filename, corresponding useafter\n", "raw_dkc_tab = Table([rawfiles, obsdates, dkcfiles, uafters, active_dates],\n", - " names=('filename','date-obs','dkc-filename','dkc-useafter','dkc-activation'))\n", + " names=('filename', 'date-obs', 'dkc-filename', 'dkc-useafter', 'dkc-activation'))\n", "# Display table\n", - "raw_dkc_tab\n" + "raw_dkc_tab" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\n", - "## 6. Modify Image Header\n", - "Now that we know which v1.0 CTE corrected dark current reference file corresponds to our `raw` science file, we're ready to edit the header
keywords the v1.0 CTE corrected reference files and tables. \n", - "\n", - "\n", - "## 6.1 `DRKCFILE`\n", - "First, we will edit `raw` file's header with the v1.0 CTE corrected dark current reference file, `DRKCFILE`, that we just found in Section 6. \n" + "\n", + "## 6.1 Download the Correct `DRKCFILE` from CRDS\n", + "Now that we know the name of the correct `DRKCFILE`, it must be retrieved from CRDS and stored on your local machine so that it can be used
during calibration. To copy the file from CRDS we use the [crds sync](https://hst-crds.stsci.edu/static/users_guide/command_line_tools.html#crds-sync) command. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for dkc in raw_dkc_tab['dkc-filename']:\n", + " crds_sync = f\"crds sync --hst --files {dkc} --output-dir {os.environ['iref']} \"\n", + " run_sync = os.system(crds_sync)\n", + " if run_sync != 0:\n", + " print(f\"crds sync failed with exit code: {run_sync}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## 6.2 Modify the Image Header Keyword, `DRKCFILE`\n", + "With the v1.0 CTE-corrected dark current reference file that corresponds to our `raw` science file copied to our local machine, we're ready to
edit the header keyword with the proper `DRKCFILE`. " ] }, { @@ -341,19 +411,20 @@ "metadata": {}, "outputs": [], "source": [ - " for file in glob.glob('i*raw.fits'):\n", + "for file in glob.glob('i*raw.fits'):\n", " # Using raw_dkc_tab from above, grab appropriate drkcfile\n", - " ctecorr_dark = 'iref$'+raw_dkc_tab[raw_dkc_tab['filename']==file]['dkc-filename'][0]\n", - " fits.setval(file, 'DRKCFILE', value = ctecorr_dark)\n" + " ctecorr_dark = 'iref$'+raw_dkc_tab[raw_dkc_tab['filename'] == file]['dkc-filename'][0]\n", + " fits.setval(file, 'DRKCFILE', value=ctecorr_dark)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "\n", - "## 6.2 `PCTETAB`\n", - "The next keyword we're going to modify is the `PCTETAB`. This is the pixel-based correction reference table and without it the algorithm will
not work. In order to use the v1.0 pixel based correction, `PCTETAB` will need to be set to the proper reference file. In the below cell, we set
the `raw` file's `PCTETAB` to the v1.0 reference table, [zcv2057mi_cte.fits](https://hst-crds.stsci.edu/browse/zcv2057mi_cte.fits).\n" + "\n", + "## 7. Download the v1.0 `PCTETAB`\n", + "\n", + "The next reference file we're going to download from CRDS is the `PCTETAB`. This is the pixel-based correction reference table and without it
the algorithm will not work. To use the v1.0 pixel-based correction, we must retrieve the v1.0 `PCTETAB` and set the header keyword to
the proper reference file. In the cells below, we use the `crds sync` command again and then set the `raw` file's `PCTETAB` to the v1.0
reference table, [zcv2057mi_cte.fits](https://hst-crds.stsci.edu/browse/zcv2057mi_cte.fits).\n" ] }, { @@ -362,7 +433,27 @@ "metadata": {}, "outputs": [], "source": [ - "fits.setval('idv404axq_raw.fits', 'PCTETAB', value = 'iref$zcv2057mi_cte.fits') \n" + "crds_sync = f\"crds sync --hst --files zcv2057mi_cte.fits --output-dir {os.environ['iref']}\"\n", + "run_sync = os.system(crds_sync)\n", + "if run_sync != 0:\n", + " print(f\"crds sync failed: {run_sync}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## 7.1 Modify the Image Header Keyword, `PCTETAB`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "fits.setval('idv404axq_raw.fits', 'PCTETAB', value='iref$zcv2057mi_cte.fits') " ] }, { @@ -370,9 +461,9 @@ "metadata": {}, "source": [ "\n", - "## 7. Re-Inspect Image Header \n", + "## 8. Re-Inspect Image Header \n", "Now with the headers modified, we inspect the keywords one last time to verify the file was updated properly before we process
it through `calwf3`. At this point you should see:
\n", - "- `PCTETAB` set to `iref$zcv2057mi_cte.fits\t`
\n", + "- `PCTETAB` set to `iref$zcv2057mi_cte.fits`
\n", "- `DRKCFILE` set to `iref$3961719li_dkc.fits`
" ] }, @@ -383,7 +474,7 @@ "outputs": [], "source": [ "# Recollect and display header keywords\n", - "file, date, expstart, pctetab, drkcfile, pctecorr = [],[],[],[],[],[]\n", + "file, date, expstart, pctetab, drkcfile, pctecorr = [], [], [], [], [], []\n", "for f in glob.glob('*raw.fits'):\n", " h = fits.getheader(f)\n", " file.append(h['filename'])\n", @@ -394,7 +485,7 @@ " pctecorr.append(h['pctecorr'])\n", "\n", "updated_table = Table([file, date, expstart, pctetab, drkcfile, pctecorr],\n", - " names=('file','date-obs', 'expstart', 'pctetab', 'drkcfile', 'pctecorr'))\n", + " names=('file', 'date-obs', 'expstart', 'pctetab', 'drkcfile', 'pctecorr'))\n", "updated_table['expstart'].format = '5.6f'\n", "\n", "# Sort and display the table\n", @@ -402,35 +493,6 @@ "updated_table" ] }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "\n", - "## 8. Set Environment Variables\n", - "
If you already have the absolute paths set for CRDS, please skip this step and proceed to Section 9.
\n", - " \n", - "Before we run `calwf3`, we need to [set environment variables](https://hst-crds.stsci.edu/docs/cmdline_bestrefs/) for several subsequent calibration tasks. We will point to a subdirectory called
`crds_cache/` using the `IREF` environment variable. The `IREF` variable is used for WFC3 reference files. Other instruments use other
variables, e.g., `JREF` for ACS. You have the option to permanently add these environment variables to your user profile by adding the path
in your shell's configuration file. If you're using bash, you would edit the `~/.bash_profile` file with lines such as:\n", - "\n", - "* `export CRDS_PATH=\"$HOME/crds_cache\"`\n", - "* `export CRDS_SERVER_URL=\"https://hst-crds.stsci.edu\"`\n", - "* `export iref=\"${CRDS_PATH}/references/hst/iref/\"`\n", - "\n", - "\n" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", - "os.environ['CRDS_SERVER'] = 'https://hst-crds.stsci.edu'\n", - "os.environ['CRDS_PATH'] = '~/crds_cache'\n", - "os.environ['iref'] = '~/crds_cache/references/hst/iref/'\n" - ] - }, { "cell_type": "markdown", "metadata": {}, @@ -448,7 +510,8 @@ }, "outputs": [], "source": [ - "calwf3('idv404axq_raw.fits')\n" + "if not os.path.exists('idv404axq_flt.fits'):\n", + " calwf3('idv404axq_raw.fits')" ] }, { @@ -458,7 +521,7 @@ "outputs": [], "source": [ "# show list of cwd to verify calibrated files were made\n", - "!ls -ltr *.fits\n" + "!ls -ltr *.fits" ] }, { @@ -476,8 +539,8 @@ "metadata": {}, "outputs": [], "source": [ - "#Recollect and display FLC header keywords\n", - "file, pctetab, drkcfile, pctecorr, calver, ctename, ctever = [],[],[],[],[],[],[]\n", + "# Recollect and display FLC header keywords\n", + "file, pctetab, drkcfile, pctecorr, calver, ctename, ctever = [], [], [], [], [], [], []\n", "for f in glob.glob('*flc.fits'):\n", " h = fits.getheader(f)\n", " file.append(h['filename'])\n", @@ -489,7 +552,7 @@ " ctever.append(h['cte_ver'])\n", "\n", "final_table = Table([file, pctetab, drkcfile, pctecorr, calver, ctename, ctever],\n", - " names=('file','pctetab', 'drkcfile', 'pctecorr','cal_ver','cte_name','cte_ver'))\n", + " names=('file', 'pctetab', 'drkcfile', 'pctecorr', 'cal_ver', 'cte_name', 'cte_ver'))\n", "\n", "final_table" ] @@ -504,7 +567,7 @@ "\n", "\n", "## 11.1 Download the v2.0 FLC File\n", - "First, we need to download the same FLC file from MAST that is corrected with the v2.0 pixel-based CTE correction so that we can
compare it to the v1.0 FLC file we just created in the notebook." + "First, we need to download the same FLC file from MAST that is corrected with the v2.0 pixel-based CTE correction so that we can
compare it to the v1.0 FLC file we just created in the notebook. In this step we rename the downloaded FLC to `idv404axq_v2.0_flc.fits`." ] }, { @@ -514,13 +577,13 @@ "outputs": [], "source": [ "# Get the observation records\n", - "obs_table = Observations.query_criteria(obs_id='idv404axq*',proposal_id=15576)\n", + "obs_table = Observations.query_criteria(obs_id='idv404axq*', proposal_id=15576)\n", "\n", "# Get the listing of data products\n", "products = Observations.get_product_list(obs_table)\n", "\n", "# Filter the products for the RAW files\n", - "filtered_products = Observations.filter_products(products, productSubGroupDescription='FLC',project='CALWF3')\n", + "filtered_products = Observations.filter_products(products, productSubGroupDescription='FLC', project='CALWF3')\n", "\n", "# Download all the images above\n", "download_table = Observations.download_products(filtered_products, mrp_only=False)\n", @@ -566,9 +629,9 @@ "PAM_uvis2 = fits.getdata('example/UVIS2wfc3_map.fits')\n", "\n", "# Stich UVIS1 and 2 together and multiply by pixel area map\n", - "v2sci = np.concatenate([v2_uvis2*PAM_uvis2 ,v2_uvis1*PAM_uvis1])\n", - "v1sci = np.concatenate([v1_uvis2*PAM_uvis2 ,v1_uvis1*PAM_uvis1])\n", - "fltsci = np.concatenate([flt_uvis2*PAM_uvis2,flt_uvis1*PAM_uvis1])\n" + "v2sci = np.concatenate([v2_uvis2*PAM_uvis2, v2_uvis1*PAM_uvis1])\n", + "v1sci = np.concatenate([v1_uvis2*PAM_uvis2, v1_uvis1*PAM_uvis1])\n", + "fltsci = np.concatenate([flt_uvis2*PAM_uvis2, flt_uvis1*PAM_uvis1])" ] }, { @@ -587,30 +650,31 @@ "outputs": [], "source": [ "# Generate subplots\n", - "fig, [ax1,ax2,ax3] = plt.subplots(1,3,figsize=(15,10),dpi=150)\n", + "fig, [ax1, ax2, ax3] = plt.subplots(1, 3, figsize=(15, 10), dpi=150)\n", "\n", "# Generate background subsections\n", - "flt_bkg = fltsci[2070:2120,2180:2230]\n", - "v1_bkg = v1sci[2070:2120,2180:2230]\n", - "v2_bkg = v2sci[2070:2120,2180:2230]\n", + "flt_bkg = fltsci[2070:2120, 2180:2230]\n", + "v1_bkg = v1sci[2070:2120, 2180:2230]\n", + "v2_bkg = v2sci[2070:2120, 2180:2230]\n", "\n", "# Calculate min and max values for image scaling \n", - "v1z1,v1z2 = zscale.zscale(v1_bkg)\n", + "z = ZScaleInterval()\n", + "z1, z2 = z.get_limits(v1_bkg)\n", "\n", "# Display background subsection\n", - "im1 = ax1.imshow(flt_bkg,origin='lower',cmap='Greys_r',vmin=v1z1,vmax=v1z2)\n", - "im2 = ax2.imshow(v1_bkg,origin='lower',cmap='Greys_r',vmin=v1z1,vmax=v1z2)\n", - "im3 = ax3.imshow(v2_bkg,origin='lower',cmap='Greys_r',vmin=v1z1,vmax=v1z2)\n", + "im1 = ax1.imshow(flt_bkg, origin='lower', cmap='Greys_r', vmin=z1, vmax=z2)\n", + "im2 = ax2.imshow(v1_bkg, origin='lower', cmap='Greys_r', vmin=z1, vmax=z2)\n", + "im3 = ax3.imshow(v2_bkg, origin='lower', cmap='Greys_r', vmin=z1, vmax=z2)\n", "\n", "# Formatting\n", - "fig.colorbar(im1,ax=ax1,shrink=0.35,pad=0.02)\n", - "fig.colorbar(im2,ax=ax2,shrink=0.35,pad=0.02)\n", - "fig.colorbar(im3,ax=ax3,shrink=0.35,pad=0.02)\n", - "ax1.set_title('FLT File BKG Subsection',size=14)\n", - "ax2.set_title('v1.0 PCTE FLC BKG Subsection',size=14)\n", - "ax3.set_title('v2.0 PCTE FLC BKG Subsection',size=14)\n", - "ax1.axis('off'),ax2.axis('off'),ax3.axis('off')\n", - "fig.tight_layout()\n" + "fig.colorbar(im1, ax=ax1, shrink=0.35, pad=0.02)\n", + "fig.colorbar(im2, ax=ax2, shrink=0.35, pad=0.02)\n", + "fig.colorbar(im3, ax=ax3, shrink=0.35, pad=0.02)\n", + "ax1.set_title('FLT File BKG Subsection', size=14)\n", + "ax2.set_title('v1.0 PCTE FLC BKG Subsection', size=14)\n", + "ax3.set_title('v2.0 PCTE FLC BKG Subsection', size=14)\n", + 
"ax1.axis('off'), ax2.axis('off'), ax3.axis('off')\n", + "fig.tight_layout()" ] }, { @@ -619,8 +683,7 @@ "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", "\n", - "\n", - "\n" + "\"An\n" ] }, { @@ -639,31 +702,30 @@ "outputs": [], "source": [ "# Generate subplots\n", - "fig, [ax1,ax2] = plt.subplots(2,1,figsize=(7,10),dpi=120)\n", + "fig, [ax1, ax2] = plt.subplots(2, 1, figsize=(7, 10), dpi=120)\n", "\n", "# Plot background subsection histograms\n", - "ax1.hist(flt_bkg.ravel(),bins=100,range=(-30,100),histtype='step',color='C3',label='FLT')\n", - "ax1.hist(v1_bkg.ravel(),bins=100,range=(-30,100),histtype='step',color='C0',label='v1.0 FLC')\n", - "ax1.hist(v2_bkg.ravel(),bins=100,range=(-30,100),histtype='step',color='k',label='v2.0 FLC')\n", + "ax1.hist(flt_bkg.ravel(), bins=100, range=(-30, 100), histtype='step', color='C3', label='FLT')\n", + "ax1.hist(v1_bkg.ravel(), bins=100, range=(-30, 100), histtype='step', color='C0', label='v1.0 FLC')\n", + "ax1.hist(v2_bkg.ravel(), bins=100, range=(-30, 100), histtype='step', color='k', label='v2.0 FLC')\n", "\n", "# Plot background subsection differential histograms\n", - "ax2.hist((v1_bkg-flt_bkg).ravel(),bins=100,range=(-25,50),histtype='step',color='magenta',label='v1.0 FLC $-$ FLT')\n", - "ax2.hist((v2_bkg-flt_bkg).ravel(),bins=100,range=(-25,50),histtype='step',color='limegreen',label='v2.0 FLC $-$ FLT')\n", - "ax2.hist((v1_bkg-v2_bkg).ravel(),bins=100,range=(-25,50),histtype='step',color='C9',label='v1.0 $-$ v2.0 FLC')\n", + "ax2.hist((v1_bkg-flt_bkg).ravel(), bins=100, range=(-25, 50), histtype='step', color='magenta', label='v1.0 FLC $-$ FLT')\n", + "ax2.hist((v2_bkg-flt_bkg).ravel(), bins=100, range=(-25, 50), histtype='step', color='limegreen', label='v2.0 FLC $-$ FLT')\n", + "ax2.hist((v1_bkg-v2_bkg).ravel(), bins=100, range=(-25, 50), histtype='step', color='C9', label='v1.0 $-$ v2.0 FLC')\n", "\n", "# Formatting\n", - "ax1.set_title('Background Subsection Histogram',size=14)\n", - "ax2.set_title('Background Subsection Differential Histogram',size=14)\n", - "ax1.set_xlabel('Pixel Value [e-]',size=12)\n", - "ax1.set_ylabel('Frequency',size=12)\n", - "ax2.set_xlabel('Pixel Value [e-]',size=12)\n", - "ax2.set_ylabel('Frequency',size=12)\n", - "ax1.grid(alpha=0.5),ax2.grid(alpha=0.5)\n", + "ax1.set_title('Background Subsection Histogram', size=14)\n", + "ax2.set_title('Background Subsection Differential Histogram', size=14)\n", + "ax1.set_xlabel('Pixel Value [e-]', size=12)\n", + "ax1.set_ylabel('Frequency', size=12)\n", + "ax2.set_xlabel('Pixel Value [e-]', size=12)\n", + "ax2.set_ylabel('Frequency', size=12)\n", + "ax1.grid(alpha=0.5), ax2.grid(alpha=0.5)\n", "ax1.legend(), ax2.legend()\n", "ax1.set_yscale('log')\n", "ax2.set_yscale('log')\n", - "\n", - "fig.tight_layout()\n" + "fig.tight_layout()" ] }, { @@ -684,23 +746,24 @@ "outputs": [], "source": [ "# Generate subplots\n", - "fig, [ax1,ax2] = plt.subplots(2,1,figsize=(8,8),dpi=150)\n", + "fig, [ax1, ax2] = plt.subplots(2, 1, figsize=(8, 8), dpi=150)\n", "\n", "# Calculate min and max values for image scaling \n", - "v1z1,v1z2 = zscale.zscale(v1_uvis1)\n", + "z = ZScaleInterval()\n", + "z1, z2 = z.get_limits(v1_uvis1)\n", "\n", "# Display subsection\n", - "im1 = ax1.imshow(v1_uvis1,origin='lower',cmap='Greys_r',vmin=v1z1,vmax=v1z2)\n", - "im2 = ax2.imshow(v2_uvis1,origin='lower',cmap='Greys_r',vmin=v1z1,vmax=v1z2)\n", + "im1 = ax1.imshow(v1_uvis1, origin='lower', cmap='Greys_r', vmin=z1, vmax=z2)\n", + "im2 = ax2.imshow(v2_uvis1, origin='lower', cmap='Greys_r', vmin=z1, vmax=z2)\n", "\n", "# Formatting\n", - "ax1.set_xlim(85,325),ax2.set_xlim(85,325)\n", - "ax1.set_ylim(0,149),ax2.set_ylim(0,149)\n", - 
"fig.colorbar(im1,ax=ax1,shrink=0.95,pad=0.01)\n", - "fig.colorbar(im2,ax=ax2,shrink=0.95,pad=0.01)\n", - "ax1.set_title('v1.0 PCTE FLC UVIS1 Subsection',size=14)\n", - "ax2.set_title('v2.0 PCTE FLC UVIS1 Subsection',size=14)\n", - "fig.tight_layout()\n" + "ax1.set_xlim(85, 325), ax2.set_xlim(85, 325)\n", + "ax1.set_ylim(0, 149), ax2.set_ylim(0, 149)\n", + "fig.colorbar(im1, ax=ax1, shrink=0.95, pad=0.01)\n", + "fig.colorbar(im2, ax=ax2, shrink=0.95, pad=0.01)\n", + "ax1.set_title('v1.0 PCTE FLC UVIS1 Subsection', size=14)\n", + "ax2.set_title('v2.0 PCTE FLC UVIS1 Subsection', size=14)\n", + "fig.tight_layout()" ] }, { @@ -742,7 +805,7 @@ "metadata": {}, "outputs": [], "source": [ - "def get_flux(data,aperture,annulus_aperture):\n", + "def get_flux(data, aperture, annulus_aperture):\n", " \"\"\"\n", " Function to calculate background subtracted aperture sum \n", " \n", @@ -765,20 +828,21 @@ " phot = aperture_photometry(data, aperture)\n", " \n", " # Measure background around sources. aperture_stats_tbl() comes from background_median.py\n", - " bkg_phot = aperture_stats_tbl(data, annulus_aperture, method = 'exact', sigma_clip = True)\n", + " bkg_phot = aperture_stats_tbl(data, annulus_aperture, method='exact', sigma_clip=True)\n", " \n", " # Calculate background subtracted aperture sum\n", " flux = phot['aperture_sum'] - bkg_phot['aperture_median'] * aperture.area\n", " \n", " return flux\n", "\n", + "\n", "# Approximate x,y pixel locations of each star in the 4Kx4K array\n", - "positions = [(299.4,2135.6),\n", - " (114.7,2093.4),\n", - " (171.3,2074.9),\n", - " (262.6,2164.9),\n", - " (289.1,2085.6),\n", - " (204.8,2073.1)]\n", + "positions = [(299.4, 2135.6),\n", + " (114.7, 2093.4),\n", + " (171.3, 2074.9),\n", + " (262.6, 2164.9),\n", + " (289.1, 2085.6),\n", + " (204.8, 2073.1)]\n", "\n", "# Photutils cirular aperture object with small radius\n", "aperture = CircularAperture(positions, r=3)\n", @@ -788,9 +852,9 @@ "\n", "# Call function to calculate flux of stars\n", "# THE RUNTIME WARNING MAY BE IGNORED\n", - "fltflux = get_flux(fltsci,aperture,annulus_aperture)\n", - "v1flux = get_flux(v1sci,aperture,annulus_aperture)\n", - "v2flux = get_flux(v2sci,aperture,annulus_aperture)\n" + "fltflux = get_flux(fltsci, aperture, annulus_aperture)\n", + "v1flux = get_flux(v1sci, aperture, annulus_aperture)\n", + "v2flux = get_flux(v2sci, aperture, annulus_aperture)" ] }, { @@ -819,33 +883,33 @@ "outputs": [], "source": [ "# Generate subplots\n", - "fig, [ax1,ax2] = plt.subplots(2,1,figsize=(8,13),dpi=120)\n", - "ax1.grid(alpha=0.5,which='both'),ax2.grid(alpha=0.5)\n", + "fig, [ax1, ax2] = plt.subplots(2, 1, figsize=(8, 13), dpi=120)\n", + "ax1.grid(alpha=0.5), ax2.grid(alpha=0.5)\n", "\n", "# Find median flux values between products\n", - "medflux = np.median([fltflux,v1flux,v2flux],axis=0)\n", + "medflux = np.median([fltflux, v1flux, v2flux], axis=0)\n", "\n", "# Scatter plot of measured flux\n", - "ax1.scatter(medflux,fltflux,25,marker='o',c='C3',label='FLT')\n", - "ax1.scatter(medflux,v1flux,30,marker='^',c='C0',label='v1.0 FLC')\n", - "ax1.scatter(medflux,v2flux,45,marker='*',c='k',label='v2.0 FLC')\n", + "ax1.scatter(medflux, fltflux, 25, marker='o', c='C3', label='FLT')\n", + "ax1.scatter(medflux, v1flux, 30, marker='^', c='C0', label='v1.0 FLC')\n", + "ax1.scatter(medflux, v2flux, 45, marker='*', c='k', label='v2.0 FLC')\n", "\n", "# Scatter plot of percentage difference\n", - "ax2.scatter(medflux,abs((fltflux-v1flux))/((fltflux+v1flux)/2)*100,30,\n", - " 
marker='^',c='magenta',label=r'$\\frac{|FLT - v1.0|}{(FLT + v1.0) ÷ 2}$')\n", - "ax2.scatter(medflux,abs((fltflux-v2flux))/((fltflux+v2flux)/2)*100,45,\n", - " marker='*',c='limegreen',label=r'$\\frac{|FLT - v2.0|}{(FLT + v2.0) ÷ 2}$')\n", - "ax2.scatter(medflux,abs((v1flux-v2flux))/((v1flux+v2flux)/2)*100,25,\n", - " marker='s',c='C9',label=r'$\\frac{|v1.0 - v2.0|}{(v1.0 + v2.0) ÷ 2}$')\n", + "ax2.scatter(medflux, abs((fltflux-v1flux))/((fltflux+v1flux)/2)*100, 30,\n", + " marker='^', c='magenta', label=r'$\\frac{|FLT - v1.0|}{(FLT + v1.0) ÷ 2}$')\n", + "ax2.scatter(medflux, abs((fltflux-v2flux))/((fltflux+v2flux)/2)*100, 45,\n", + " marker='*', c='limegreen', label=r'$\\frac{|FLT - v2.0|}{(FLT + v2.0) ÷ 2}$')\n", + "ax2.scatter(medflux, abs((v1flux-v2flux))/((v1flux+v2flux)/2)*100, 25,\n", + " marker='s', c='C9', label=r'$\\frac{|v1.0 - v2.0|}{(v1.0 + v2.0) ÷ 2}$')\n", "\n", "# Formatting \n", - "ax1.set_title('Measured Flux Within 3-pix Radius Aperture',size=14)\n", - "ax1.set_xlabel('Median Flux [e-]',size=12)\n", - "ax1.set_ylabel('Flux [e-]',size=12)\n", - "ax2.set_xlabel('Median Flux [e-]',size=12)\n", - "ax2.set_ylabel('Percent Difference [%]',size=12)\n", - "ax1.legend(prop={'size':11}),ax2.legend(prop={'size':15})\n", - "ax1.set_yscale('log')\n" + "ax1.set_title('Measured Flux Within 3-pix Radius Aperture', size=14)\n", + "ax1.set_xlabel('Median Flux [e-]', size=12)\n", + "ax1.set_ylabel('Flux [e-]', size=12)\n", + "ax2.set_xlabel('Median Flux [e-]', size=12)\n", + "ax2.set_ylabel('Percent Difference [%]', size=12)\n", + "ax1.legend(prop={'size': 11}), ax2.legend(prop={'size': 15})\n", + "ax1.set_yscale('log')" ] }, { @@ -880,20 +944,20 @@ "- [WFC3 Website](https://www.stsci.edu/hst/instrumentation/wfc3)\n", "- [WFC3 Instrument Handbook](https://hst-docs.stsci.edu/wfc3ihb)\n", "- [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb)\n", - "- [STScI Jupyter-notebooks](https://github.com/spacetelescope/notebooks/tree/master/notebooks)\n", + "- [STScI Jupyter-notebooks](https://github.com/spacetelescope/hst_notebooks)\n", "- [STScI Astroconda Channel](http://ssb.stsci.edu/astroconda)\n", "\n", "\n", "## About this Notebook\n", "\n", - "**Author:** Benjamin Kuhn; WFC3 Instrument Team\n", - "\n", - "**Updated on:** January 19, 2023\n", + "**Author:** Benjamin Kuhn; WFC3 Instrument Team
\n", + "**Created:** February 22, 2022
\n", + "**Last updated on:** November 15, 2023\n", "\n", "\n", "## Citations\n", "\n", - "If you use `numpy`, `astropy`, `astroquery`, `matplotlib`, or photutils for published research, please cite the authors.
\n", + "If you use `astropy`, `astroquery`, `numpy`, `matplotlib`, `photutils`, or `scipy` for published research, please cite the authors.
\n", "Follow these links for more information about citing the libraries below:\n", "\n", "* [Citing `astropy`](https://www.astropy.org/acknowledging.html)\n", @@ -901,6 +965,7 @@ "* [Citing `matplotlib`](https://matplotlib.org/stable/users/project/citing.html)\n", "* [Citing `numpy`](https://numpy.org/citing-numpy/)\n", "* [Citing `photutils`](https://photutils.readthedocs.io/en/stable/citation.html)\n", + "* [Citing `scipy`](https://scipy.org/citing-scipy/)\n", "
\n", "***\n", "[Top of Page](#top)\n", @@ -924,7 +989,36 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.16" + "version": "3.11.0" + }, + "varInspector": { + "cols": { + "lenName": 16, + "lenType": 16, + "lenVar": 40 + }, + "kernels_config": { + "python": { + "delete_cmd_postfix": "", + "delete_cmd_prefix": "del ", + "library": "var_list.py", + "varRefreshCmd": "print(var_dic_list())" + }, + "r": { + "delete_cmd_postfix": ") ", + "delete_cmd_prefix": "rm(", + "library": "var_list.r", + "varRefreshCmd": "cat(var_dic_list()) " + } + }, + "types_to_exclude": [ + "module", + "function", + "builtin_function_or_method", + "instance", + "_Feature" + ], + "window_display": false } }, "nbformat": 4, diff --git a/notebooks/WFC3/calwf3_v1.0_cte/example/background_median.py b/notebooks/WFC3/calwf3_v1.0_cte/example/background_median.py index 7e9c6a88d..3ae6d72f7 100644 --- a/notebooks/WFC3/calwf3_v1.0_cte/example/background_median.py +++ b/notebooks/WFC3/calwf3_v1.0_cte/example/background_median.py @@ -16,13 +16,13 @@ See the docstring of aperture_stats_tbl for more info. """ import numpy as np - # WAY faster than astropy.stats.sigma_clipped_stats from scipy.stats import sigmaclip from astropy.table import Table -def aperture_stats_tbl(data, apertures, - method='exact', sigma_clip=True): + +def aperture_stats_tbl(data, apertures, method='exact', sigma_clip=True): + """Computes mean/median/mode/std in Photutils apertures. Compute statistics for custom local background methods. This is primarily intended for estimating backgrounds @@ -63,23 +63,24 @@ def aperture_stats_tbl(data, apertures, aperture_stats = np.array(aperture_stats) - # Place the array of the x y positions alongside the stats stacked = np.hstack([apertures.positions, aperture_stats]) # Name the columns - names = ['X','Y','aperture_mean','aperture_median','aperture_mode', - 'aperture_std', 'aperture_area'] + names = ['X', 'Y', 'aperture_mean', 'aperture_median', + 'aperture_mode', 'aperture_std', 'aperture_area'] # Make the table stats_tbl = Table(data=stacked, names=names) - return stats_tbl + def calc_aperture_mmm(data, mask, sigma_clip): + """Helper function to actually calculate the stats for pixels falling within some Photutils aperture mask on some array of data. 
""" + cutout = mask.cutout(data, fill_value=np.nan) if cutout is None: return (np.nan, np.nan, np.nan, np.nan, np.nan) @@ -95,4 +96,5 @@ def calc_aperture_mmm(data, mask, sigma_clip): mode = 3 * median - 2 * mean actual_area = (~np.isnan(values)).sum() + return (mean, median, mode, std, actual_area) diff --git a/notebooks/WFC3/calwf3_v1.0_cte/pre-requirements.sh b/notebooks/WFC3/calwf3_v1.0_cte/pre-requirements.sh new file mode 100644 index 000000000..82294ac5e --- /dev/null +++ b/notebooks/WFC3/calwf3_v1.0_cte/pre-requirements.sh @@ -0,0 +1 @@ +conda install --yes -c http://ssb.stsci.edu/astroconda hstcal==2.5.0 diff --git a/notebooks/WFC3/calwf3_v1.0_cte/requirements.txt b/notebooks/WFC3/calwf3_v1.0_cte/requirements.txt new file mode 100644 index 000000000..71a33b71f --- /dev/null +++ b/notebooks/WFC3/calwf3_v1.0_cte/requirements.txt @@ -0,0 +1,9 @@ +astropy +astroquery +crds +jupyter +matplotlib +numpy +photutils +scipy +wfc3tools From 465c7e43de61d93805637db967067a2fe3f50161 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Tue, 21 Nov 2023 14:17:44 -0500 Subject: [PATCH 05/30] Add WFC3 notebook 'wfc3_exception_report.ipynb' (#102) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * updated _toc.yml and _config.yml files * Added trailing line to rad_prof.py. * Created notebook-level requirements.txt file. * wfc3_exception_report.ipynb: cleared all notebook outputs * Update rad_prof.py for PEP8 changes for PEP8 compliance * Update display_image.py for PEP8 edits for PEP8 compliance * Update display_image.py for PEP8 changes for PEP8 compiance * Update display_image.py for PEP8 🫠 * Update rad_prof.py for PEP8 final edits for PEP8 compliance * Update wfc3_exception_report.ipynb for PEP8 changes for PEP8 compliance * Update wfc3_exception_report.ipynb for PEP8 * Update wfc3_exception_report.ipynb for PEP8 hopefully final PEP8 changes. sorry im so bad at this 🫠 * Update requirements.txt Removing pins for version numbers. Also removing ginga dependency since I'm having problems with it in Python 3.11. 
Instead, I'm going to use astropy.visualization's ZScaleInterval * Update display_image.py swapping ginga.util.zscale for astropy.visualization's ZScaleInterval * Update wfc3_exception_report.ipynb removed `ginga.util.zscale` dependence and replaced with `astropy.visualization`'s `ZScaleInterval` * Update wfc3_exception_report.ipynb second markdown table in section 6.1 wasn't formatted correctly * Update README.md removing the old environment installation instructions and replacing them with information about using the `requirements.txt` file * Update README.md adding line breaks * Update wfc3_exception_report.ipynb minor changes to make the code more robust and platform-independent * Update wfc3_exception_report.ipynb i added shutil to the imports but forgot to describe it in the import markdown table --------- Co-authored-by: bjkuhn --- _config.yml | 1 - _toc.yml | 2 +- notebooks/WFC3/exception_report/README.md | 21 +- .../exception_report/docs/display_image.py | 273 ++++++++---------- .../WFC3/exception_report/docs/rad_prof.py | 51 ++-- .../WFC3/exception_report/requirements.txt | 7 + .../wfc3_exception_report.ipynb | 231 +++++++-------- 7 files changed, 285 insertions(+), 301 deletions(-) create mode 100644 notebooks/WFC3/exception_report/requirements.txt diff --git a/_config.yml b/_config.yml index 0ec637038..e7f606c78 100644 --- a/_config.yml +++ b/_config.yml @@ -46,7 +46,6 @@ exclude_patterns: [notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb, notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb, notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb, notebooks/WFC3/dash/dash.ipynb, - notebooks/WFC3/exception_report/wfc3_exception_report.ipynb, notebooks/WFC3/filter_transformations/filter_transformations.ipynb, notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb, notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb, diff --git a/_toc.yml b/_toc.yml index 3703fb2aa..94dbd254f 100644 --- a/_toc.yml +++ b/_toc.yml @@ -62,7 +62,7 @@ parts: # - file: notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb - file: notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb # - file: notebooks/WFC3/dash/dash.ipynb -# - file: notebooks/WFC3/exception_report/wfc3_exception_report.ipynb + - file: notebooks/WFC3/exception_report/wfc3_exception_report.ipynb # - file: notebooks/WFC3/filter_transformations/filter_transformations.ipynb # - file: notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb # - file: notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb diff --git a/notebooks/WFC3/exception_report/README.md b/notebooks/WFC3/exception_report/README.md index def41952f..47fd10a66 100644 --- a/notebooks/WFC3/exception_report/README.md +++ b/notebooks/WFC3/exception_report/README.md @@ -3,19 +3,22 @@ when an observer receives a WFC3 Exception Report Email. This directory, once downloaded, should contain this README.md, the tutorial Jupyter Notebook `wfc3_exception_report.ipynb`, an `html` copy of the notebook, -and a subdirectory titled `docs`. The subdirectory should contain two `.py` -files, one `.png`, and one `.gif` file that are used in the notebook. +a `requirements.txt` file, and a subdirectory titled `docs`. The subdirectory +should contain two `.py` files, one `.png`, and one `.gif` file that are used +in the notebook. 
-In order to run this Jupyter Notebook, you must have created a virtual -environment, such as the one in [WFC3 Library's](https://github.com/spacetelescope/WFC3Library) installation instructions. If -you are using the `wfc3_env` environment from the `wfc3_env_legacy.yml` file in the -WFC3Library repository, then you should not need any other packages to run this -notebook. +To run this Jupyter Notebook, you must have created a virtual environment +that contains (at minimum) the packages listed in the `requirements.txt` file +that is included within the repository. We recommend creating a new conda +environment using the requirements file: + `$ conda create -n except_report python=3.11`
+ `$ conda activate except_report`
+ `$ pip install -r requirements.txt`
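+
+As an optional sanity check (the module list below simply mirrors this
+`requirements.txt`; adjust it if the file changes), you can confirm the
+installed packages import cleanly:
+ `$ python -c "import astropy, astroquery, matplotlib, numpy, photutils, scipy"`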
+ Optional Note: The tools in this notebook (specifically display_image) look much better in Jupyter Lab rather than in the classic Jupyter Notebook. If your environment has Jupyter Lab installed it's recommended you use that to run the -.ipynb file. If you're interested in adding Jupyter Lab to your environment see -the install instructions on the [Jupyter website](https://jupyter.org/install). +.ipynb file. See the [Jupyter website](https://jupyter.org/install) for more info. Please submit any questions or comments to the [WFC3 Help Desk](https://stsci.service-now.com/hst). diff --git a/notebooks/WFC3/exception_report/docs/display_image.py b/notebooks/WFC3/exception_report/docs/display_image.py index c503a9ab2..b9f046808 100644 --- a/notebooks/WFC3/exception_report/docs/display_image.py +++ b/notebooks/WFC3/exception_report/docs/display_image.py @@ -1,23 +1,23 @@ #! /usr/bin/env python - -import numpy as np import sys from astropy.io import fits -from ginga.util import zscale +from astropy.visualization import ZScaleInterval import matplotlib.pyplot as plt +import numpy as np + def display_image(filename, - colormaps=['Greys_r','Greys_r','inferno_r'], - scaling=[(None,None),(None,None),(None,None)], + colormaps=['Greys_r', 'Greys_r', 'inferno_r'], + scaling=[(None, None), (None, None), (None, None)], printmeta=False, ima_multiread=False, - figsize=(18,18), + figsize=(18, 18), dpi=200): - - """ A function to display the 'SCI', 'ERR/WHT', and 'DQ/CTX' arrays - of any WFC3 fits image. This function returns nothing, but will display - the requested image on the screen when called. + """ + A function to display the 'SCI', 'ERR/WHT', and 'DQ/CTX' arrays + of any WFC3 fits image. This function returns nothing, but will display + the requested image on the screen when called. Authors ------- @@ -51,8 +51,8 @@ def display_image(filename, List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. 
to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -108,7 +108,7 @@ def display_image(filename, print("Invalid image section specified") return 0, 0 try: - xstart = int(xsec[: xs]) + xstart = int(xsec[:xs]) except ValueError: print("Problem getting xstart") return @@ -132,7 +132,6 @@ def display_image(filename, print("Problem getting yend") return - bunit = get_bunit(h1) detector = h['detector'] issubarray = h['subarray'] si = h['primesi'] @@ -151,33 +150,32 @@ def display_image(filename, print('-'*44) print(f"Filter = {h['filter']}, Date-Obs = {h['date-obs']} T{h['time-obs']},\nTarget = {h['targname']}, Exptime = {h['exptime']}, Subarray = {issubarray}, Units = {h1['bunit']}\n") - if detector == 'UVIS': - if ima_multiread == True: + if ima_multiread is True: sys.exit("keyword argument 'ima_multiread' can only be set to True for 'ima.fits' files") try: if all_pixels: xstart = 0 ystart = 0 - xend = naxis1 # full x size + xend = naxis1 # full x size yend = naxis2*2 # full y size with fits.open(imagename) as hdu: - uvis2_sci = hdu["SCI",1].data + uvis2_sci = hdu["SCI", 1].data uvis2_err = hdu[2].data uvis2_dq = hdu[3].data - uvis1_sci = hdu["SCI",2].data + uvis1_sci = hdu["SCI", 2].data uvis1_err = hdu[5].data uvis1_dq = hdu[6].data try: - fullsci = np.concatenate([uvis2_sci,uvis1_sci]) - fulldq = np.concatenate([uvis2_dq,uvis1_dq]) - fullerr = np.concatenate([uvis2_err,uvis1_err]) + fullsci = np.concatenate([uvis2_sci, uvis1_sci]) + fulldq = np.concatenate([uvis2_dq, uvis1_dq]) + fullerr = np.concatenate([uvis2_err, uvis1_err]) - fullsci = fullsci[ystart:yend,xstart:xend] - fulldq = fulldq[ystart:yend,xstart:xend] - fullerr = fullerr[ystart:yend,xstart:xend] + fullsci = fullsci[ystart:yend, xstart:xend] + fulldq = fulldq[ystart:yend, xstart:xend] + fullerr = fullerr[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, xstart, xend, ystart, yend, @@ -185,26 +183,26 @@ def display_image(filename, figsize, dpi) except ValueError: - fullsci = np.concatenate([uvis2_sci,uvis1_sci]) - fullsci = fullsci[ystart:yend,xstart:xend] + fullsci = np.concatenate([uvis2_sci, uvis1_sci]) + fullsci = fullsci[ystart:yend, xstart:xend] - z1_sci, z2_sci = get_scale_limits(scaling[0],fullsci,'SCI') + z1_sci, z2_sci = get_scale_limits(scaling[0], fullsci, 'SCI') - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - im1 = ax1.imshow(fullsci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(fullsci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") else: ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.75,pad=.03) + fig.colorbar(im1, ax=ax1, shrink=.75, pad=.03) - except (IndexError,KeyError): + except (IndexError, KeyError): if all_pixels: - xstart = 0 - ystart = 0 - xend = naxis1 # full x size - yend = naxis2 # full y size + xstart = 0 + ystart = 0 + xend = naxis1 # full x size + yend = naxis2 # full y size with fits.open(imagename) as hdu: uvis_ext1 = hdu[1].data @@ -212,35 +210,34 @@ def display_image(filename, uvis_ext3 = hdu[3].data try: - uvis_ext1 = uvis_ext1[ystart:yend,xstart:xend] - uvis_ext2 = uvis_ext2[ystart:yend,xstart:xend] - uvis_ext3 = uvis_ext3[ystart:yend,xstart:xend] + uvis_ext1 = uvis_ext1[ystart:yend, xstart:xend] + uvis_ext2 = 
uvis_ext2[ystart:yend, xstart:xend] + uvis_ext3 = uvis_ext3[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, uvis_ext1, uvis_ext2, uvis_ext3, xstart, xend, ystart, yend, detector, fname, h1, h2, h3, figsize, dpi) - except (TypeError,IndexError,AttributeError): + except (TypeError, IndexError, AttributeError): - z1_sci, z2_sci = get_scale_limits(scaling[0],uvis_ext1,'SCI') - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - im1 = ax1.imshow(uvis_ext1,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) + z1_sci, z2_sci = get_scale_limits(scaling[0], uvis_ext1, 'SCI') + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(uvis_ext1, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") else: ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.75,pad=.03) - + fig.colorbar(im1, ax=ax1, shrink=.75, pad=.03) if detector == 'IR' and '_ima.fits' not in fname: - if ima_multiread == True: + if ima_multiread is True: sys.exit("keyword argument 'ima_multiread' can only be set to True for 'ima.fits' files") if all_pixels: xstart = 0 ystart = 0 - xend = naxis1 # full x size - yend = naxis2 # full y size + xend = naxis1 # full x size + yend = naxis2 # full y size try: with fits.open(imagename) as hdu: @@ -248,9 +245,9 @@ def display_image(filename, data_err = hdu[2].data data_dq = hdu[3].data - data_sci = data_sci[ystart:yend,xstart:xend] - data_err = data_err[ystart:yend,xstart:xend] - data_dq = data_dq[ystart:yend,xstart:xend] + data_sci = data_sci[ystart:yend, xstart:xend] + data_err = data_err[ystart:yend, xstart:xend] + data_dq = data_dq[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, data_sci, data_err, data_dq, xstart, xend, ystart, yend, @@ -258,49 +255,48 @@ def display_image(filename, figsize, dpi) except (AttributeError, TypeError, ValueError): - z1_sci, z2_sci = get_scale_limits(scaling[0],data_sci,'SCI') - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - im1 = ax1.imshow(data_sci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) - if len(fname) > 18: - ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") - else: - ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.75,pad=.03) - + z1_sci, z2_sci = get_scale_limits(scaling[0], data_sci, 'SCI') + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(data_sci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) + if len(fname) > 18: + ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") + else: + ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") + fig.colorbar(im1, ax=ax1, shrink=.75, pad=.03) if '_ima.fits' in fname: if all_pixels: xstart = 0 ystart = 0 - xend = naxis1 # full x size - yend = naxis2 # full y size + xend = naxis1 # full x size + yend = naxis2 # full y size - if ima_multiread == True: + if ima_multiread is True: nsamps = h['NSAMP'] - for ext in reversed(range(1,nsamps+1)): + for ext in reversed(range(1, nsamps+1)): with fits.open(imagename) as hdu: - data_sci = hdu['SCI',ext].data - data_err = hdu['ERR',ext].data - data_dq = hdu['DQ',ext].data + data_sci = hdu['SCI', ext].data + data_err = hdu['ERR', ext].data + data_dq = hdu['DQ', ext].data - data_sci = data_sci[ystart:yend,xstart:xend] - 
data_err = data_err[ystart:yend,xstart:xend] - data_dq = data_dq[ystart:yend,xstart:xend] + data_sci = data_sci[ystart:yend, xstart:xend] + data_err = data_err[ystart:yend, xstart:xend] + data_dq = data_dq[ystart:yend, xstart:xend] makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, - xstart, xend, ystart, yend, - detector, fname, h1, h2, h3, nsamps, ext, - figsize, dpi) + xstart, xend, ystart, yend, + detector, fname, h1, h2, h3, nsamps, ext, + figsize, dpi) - if ima_multiread == False: + if ima_multiread is False: with fits.open(imagename) as hdu: - data_sci = hdu['SCI',1].data - data_err = hdu['ERR',1].data - data_dq = hdu['DQ',1].data + data_sci = hdu['SCI', 1].data + data_err = hdu['ERR', 1].data + data_dq = hdu['DQ', 1].data - data_sci = data_sci[ystart:yend,xstart:xend] - data_err = data_err[ystart:yend,xstart:xend] - data_dq = data_dq[ystart:yend,xstart:xend] + data_sci = data_sci[ystart:yend, xstart:xend] + data_err = data_err[ystart:yend, xstart:xend] + data_dq = data_dq[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, data_sci, data_err, data_dq, xstart, xend, ystart, yend, @@ -308,37 +304,9 @@ def display_image(filename, figsize, dpi) -def get_bunit(ext1header): - """ Get the brightness unit for the plot axis label. - - Parameters - ---------- - ext1header: Header - The extension 1 header of the fits file being displayed. This is the - extension that contains the brightness unit keyword. - - Returns - ------- - The string of the brightness unit for the axis label - {'counts', 'counts/s','e$^-$', 'e$^-$/s'} - - """ - units = ext1header['bunit'] - - if units == 'COUNTS': - return 'counts' - elif units == 'COUNTS/S': - return 'counts/s' - elif units == 'ELECTRONS': - return 'e$^-$' - elif units == 'ELECTRONS/S': - return 'e$^-$/s' - else: - return units - - def get_scale_limits(scaling, array, extname): - """ Get the scale limits to use for the image extension being displayed. + """ + Get the scale limits to use for the image extension being displayed. Parameters ---------- @@ -346,8 +314,8 @@ def get_scale_limits(scaling, array, extname): List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -366,29 +334,31 @@ def get_scale_limits(scaling, array, extname): The maximum value for the image scale. 
""" + + z = ZScaleInterval() if extname == 'DQ': - if scaling[0] == None and scaling[1] == None: + if scaling[0] is None and scaling[1] is None: z1, z2 = array.min(), array.max() - elif scaling[0] == None and scaling[1] != None: + elif scaling[0] is None and scaling[1] is not None: z1 = array.min() z2 = scaling[1] - elif scaling[0] != None and scaling[1] == None: + elif scaling[0] is not None and scaling[1] is None: z1 = scaling[0] z2 = array.max() - elif scaling[0] != None and scaling[1] != None: + elif scaling[0] is not None and scaling[1] is not None: z1 = scaling[0] z2 = scaling[1] - + elif extname == 'SCI' or extname == 'ERR': - if scaling[0] == None and scaling[1] == None: - z1, z2 = zscale.zscale(array) - elif scaling[0] == None and scaling[1] != None: - z1 = zscale.zscale(array)[0] + if scaling[0] is None and scaling[1] is None: + z1, z2 = z.get_limits(array) + elif scaling[0] is None and scaling[1] is not None: + z1 = z.get_limits(array)[0] z2 = scaling[1] - elif scaling[0] != None and scaling[1] == None: + elif scaling[0] is not None and scaling[1] is None: z1 = scaling[0] - z2 = zscale.zscale(array)[1] - elif scaling[0] != None and scaling[1] != None: + z2 = z.get_limits(array)[1] + elif scaling[0] is not None and scaling[1] is not None: z1 = scaling[0] z2 = scaling[1] else: @@ -401,7 +371,7 @@ def get_scale_limits(scaling, array, extname): def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, xstart, xend, ystart, yend, detector, fname, h1, h2, h3, - figsize=(9,6), dpi=100): + figsize=(9, 6), dpi=100): """ Make a 3 column figure to display any WFC3 image or image section. Parameters @@ -410,8 +380,8 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. 
to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -477,15 +447,15 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, """ - z1_sci, z2_sci = get_scale_limits(scaling[0],fullsci,'SCI') - z1_err, z2_err = get_scale_limits(scaling[1],fullerr,'ERR') - z1_dq, z2_dq = get_scale_limits(scaling[2],fulldq,'DQ') + z1_sci, z2_sci = get_scale_limits(scaling[0], fullsci, 'SCI') + z1_err, z2_err = get_scale_limits(scaling[1], fullerr, 'ERR') + z1_dq, z2_dq = get_scale_limits(scaling[2], fulldq, 'DQ') - fig, [ax1,ax2,ax3] = plt.subplots(1,3,figsize=figsize,dpi=dpi) + fig, [ax1, ax2, ax3] = plt.subplots(1, 3, figsize=figsize, dpi=dpi) - im1 = ax1.imshow(fullsci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) - im2 = ax2.imshow(fullerr,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[1],vmin=z1_err, vmax=z2_err) - im3 = ax3.imshow(fulldq, origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[2],vmin=z1_dq, vmax=z2_dq) + im1 = ax1.imshow(fullsci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) + im2 = ax2.imshow(fullerr, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[1], vmin=z1_err, vmax=z2_err) + im3 = ax3.imshow(fulldq, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[2], vmin=z1_dq, vmax=z2_dq) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") @@ -495,14 +465,15 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") ax2.set_title(f"WFC3/{detector} {fname} {h2['extname']} ext") ax3.set_title(f"WFC3/{detector} {fname} {h3['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.25,pad=.03) - fig.colorbar(im2, ax=ax2,shrink=.25,pad=.03) - fig.colorbar(im3, ax=ax3,shrink=.25,pad=.03) + fig.colorbar(im1, ax=ax1, shrink=.25, pad=.03) + fig.colorbar(im2, ax=ax2, shrink=.25, pad=.03) + fig.colorbar(im3, ax=ax3, shrink=.25, pad=.03) + def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, xstart, xend, ystart, yend, detector, fname, h1, h2, h3, nsamps, ext, - figsize=(9,6), dpi=100): + figsize=(9, 6), dpi=100): """ Make a 3 column figure to display any WFC3 IMA image or image section. Parameters @@ -511,8 +482,8 @@ def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. 
to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -584,17 +555,17 @@ def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, """ - z1_sci, z2_sci = get_scale_limits(scaling[0],data_sci,'SCI') - z1_err, z2_err = get_scale_limits(scaling[1],data_err,'ERR') - z1_dq, z2_dq = get_scale_limits(scaling[2],data_dq,'DQ') - - fig, [ax1,ax2,ax3] = plt.subplots(1,3,figsize = figsize,dpi=dpi) - im1 = ax1.imshow(data_sci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) - im2 = ax2.imshow(data_err,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[1],vmin=z1_err, vmax=z2_err) - im3 = ax3.imshow(data_dq, origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[2],vmin=z1_dq, vmax=z2_dq) - fig.colorbar(im1, ax=ax1,shrink=.25,pad=.03) - fig.colorbar(im2, ax=ax2,shrink=.25,pad=.03) - fig.colorbar(im3, ax=ax3,shrink=.25,pad=.03) + z1_sci, z2_sci = get_scale_limits(scaling[0], data_sci, 'SCI') + z1_err, z2_err = get_scale_limits(scaling[1], data_err, 'ERR') + z1_dq, z2_dq = get_scale_limits(scaling[2], data_dq, 'DQ') + + fig, [ax1, ax2, ax3] = plt.subplots(1, 3, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(data_sci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) + im2 = ax2.imshow(data_err, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[1], vmin=z1_err, vmax=z2_err) + im3 = ax3.imshow(data_dq, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[2], vmin=z1_dq, vmax=z2_dq) + fig.colorbar(im1, ax=ax1, shrink=.25, pad=.03) + fig.colorbar(im2, ax=ax2, shrink=.25, pad=.03) + fig.colorbar(im3, ax=ax3, shrink=.25, pad=.03) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n {h1['extname']} read {(nsamps+1)-ext}") diff --git a/notebooks/WFC3/exception_report/docs/rad_prof.py b/notebooks/WFC3/exception_report/docs/rad_prof.py index 24a5aaa05..970931b5a 100644 --- a/notebooks/WFC3/exception_report/docs/rad_prof.py +++ b/notebooks/WFC3/exception_report/docs/rad_prof.py @@ -40,10 +40,11 @@ import numpy as np import matplotlib.pyplot as plt -from photutils.centroids import centroid_com, centroid_1dg, centroid_2dg +from photutils.centroids import centroid_2dg from photutils.aperture import CircularAperture from scipy.optimize import curve_fit + class RadialProfile: """Main function to calulate radial profiles. @@ -139,29 +140,29 @@ def __init__(self, x, y, data, r=5, fit=True, recenter=False, self.fit_profile() # performs fit, updates self.fitted if show: self.show_profile(ax) - + def _create_profile(self): """Compute distances to pixels in cutout""" iY, iX = np.mgrid[self.sy, self.sx] # Pixel grid indices # extent = [sx.start, sx.stop-1, sy.start, sy.stop-1] - self.distances = np.sqrt( (iX - self.x) ** 2. - + (iY - self.y) ** 2. ).flatten() + self.distances = np.sqrt((iX - self.x) ** 2. + + (iY - self.y) ** 2.).flatten() self.values = self.cutout.flatten() - + def _setup_cutout(self, data): """Cuts out the aperture and defines slice objects. General setup procedure. 
""" self.ap = CircularAperture((self.x, self.y), r=self.r) mask = self.ap.to_mask() - self.sy = slice(mask.bbox.iymin,mask.bbox.iymax,None) - self.sx = slice(mask.bbox.ixmin,mask.bbox.ixmax,None) + self.sy = slice(mask.bbox.iymin, mask.bbox.iymax, None) + self.sx = slice(mask.bbox.ixmin, mask.bbox.ixmax, None) self.cutout = mask.cutout(data, fill_value=np.nan) if self.cutout is None: self.is_empty = True - + def fit_profile(self): """Fits 1d Moffat function to measured radial profile. Fits a moffat profile to the distance and values of the pixels. @@ -171,11 +172,11 @@ def fit_profile(self): amp0 = np.amax(self.values) bias0 = np.nanmedian(self.values) best_vals, covar = curve_fit(RadialProfile.profile_model, - self.distances, - self.values, - p0 = [amp0, 1.5, 1.5, bias0], - bounds = ([0., .3, .5, 0], - [np.inf, 10., 10., np.inf])) + self.distances, + self.values, + p0=[amp0, 1.5, 1.5, bias0], + bounds=([0., .3, .5, 0], + [np.inf, 10., 10., np.inf])) hwhm = best_vals[1] * np.sqrt(2. ** (1./best_vals[2]) - 1.) self.fwhm = 2 * hwhm self.amp, self.gamma, self.alpha, self.bias = best_vals @@ -188,7 +189,7 @@ def fit_profile(self): self.fwhm = np.nan self.fitted = False self.chisquared = np.nan - + @staticmethod def profile_model(r, amp, gamma, alpha, bias): """Returns 1D Moffat profile evaluated at r values. @@ -219,7 +220,7 @@ def profile_model(r, amp, gamma, alpha, bias): """ model = amp * (1. + (r / gamma) ** 2.) ** (-1. * alpha) + bias return model - + def recenter_source(self, data): """Recenters source position in cutout and updates x,y attributes""" @@ -244,7 +245,7 @@ def recenter_source(self, data): self.x = xg1 + self.sx.start self.y = yg1 + self.sy.start self._setup_cutout(data) - + def show_profile(self, ax=None, show_fit=True): """Makes plot of radial profile. @@ -274,24 +275,22 @@ def show_profile(self, ax=None, show_fit=True): fig = plt.figure(dpi=110) ax = fig.add_subplot(111) - ax.scatter(self.distances, self.values, alpha=.5,s=3) - min_y = np.amin(self.values[self.values >0.])/2. - #ax.set_ylim(min_y, np.nanmax(self.values)*2.) + ax.scatter(self.distances, self.values, alpha=.5, s=3) ax.set_ylim(0.1, np.nanmax(self.values)*2.) ax.set_xlim(0.) 
ax.set_yscale('log') - ax.set_ylabel('Pixel Value',size=13) - ax.set_xlabel('Distance from centroid [pix]',size=13) + ax.set_ylabel('Pixel Value', size=13) + ax.set_xlabel('Distance from centroid [pix]', size=13) if self.fitted and show_fit: - tmp_r = np.arange(0,np.ceil(np.amax(self.distances)),.1) + tmp_r = np.arange(0, np.ceil(np.amax(self.distances)), .1) model_fit = RadialProfile.profile_model(tmp_r, self.amp, self.gamma, self.alpha, self.bias) - label = r'$\gamma$= {}, $\alpha$ = {}'.format(round(self.gamma,2), - round(self.alpha,2)) + label = r'$\gamma$= {}, $\alpha$ = {}'.format(round(self.gamma, 2), + round(self.alpha, 2)) label += '\nFWHM = {}'.format(round(self.fwhm, 2)) - ax.plot(tmp_r, model_fit, label=label,color='k') - ax.legend(loc=1,prop={'size':13}) + ax.plot(tmp_r, model_fit, label=label, color='k') + ax.legend(loc=1, prop={'size': 13}) return ax diff --git a/notebooks/WFC3/exception_report/requirements.txt b/notebooks/WFC3/exception_report/requirements.txt new file mode 100644 index 000000000..26d1dafa3 --- /dev/null +++ b/notebooks/WFC3/exception_report/requirements.txt @@ -0,0 +1,7 @@ +astropy +astroquery +jupyter +matplotlib +numpy +photutils +scipy diff --git a/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb b/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb index 56efe0fd5..c1089bb4a 100755 --- a/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb +++ b/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb @@ -116,18 +116,22 @@ "source": [ "## 1. Imports \n", "\n", - "Installation instructions for this notebook are in a `README.md` attached to the repository.
\n", + "
This notebook assumes you have created and activated a virtual environment using the requirements file in this notebook's repository.
\n", + "\n", "Please make sure you have read the contents of the `README.md` before continuing the notebook\n", " \n", - "We import: \n", + "We import:
\n", "\n", - "- *glob* to make lists of files\n", - "- *os* to name files and remove directories \n", - "- *astropy.io.fits* for accessing FITS files\n", - "- *astropy.table Table* for creating tidy tables of the data\n", - "- *astroquery.mast.Observations* for downloading data from MAST\n", - "- *matplotlib.pyplot* for plotting data\n", - "- *display_image* for displaying any type of WFC3 image" + "| Package Name | Purpose |\n", + "|:-------------------------------|:--------------------------------------|\n", + "| `glob` | creating list of files |\n", + "| `os` | setting environment variables |\n", + "| `shutil` | direcotry clean up |\n", + "| `astropy.io.fits` | opening and modifying fits files |\n", + "| `astroquery.mast.Observations` | downloading data from MAST |\n", + "| `astropy.table.Table` | creating and manipulating data tables |\n", + "| `matplotlib.pyplot` | plotting and displaying images |\n", + "| `docs.display_image` | for displaying any type of WFC3 image |" ] }, { @@ -143,13 +147,14 @@ "%matplotlib inline\n", "import glob \n", "import os\n", + "import shutil\n", "\n", "from astropy.io import fits\n", "from astropy.table import Table\n", "from astroquery.mast import Observations\n", "import matplotlib.pyplot as plt\n", "\n", - "from docs.display_image import display_image\n" + "from docs.display_image import display_image" ] }, { @@ -184,22 +189,23 @@ "metadata": {}, "outputs": [], "source": [ - "exp_ids = ['IEPP01010'] # Edit with exposure ID(s)\n", + "# Edit with exposure ID(s)\n", + "exp_ids = ['IEPP01010'] \n", "\n", "# Specify flle types to download\n", - "file_types = ['FLT', 'JIF','JIT']\n", + "file_types = ['FLT', 'JIF', 'JIT']\n", "\n", - "#loop through exposure id\n", + "# Loop through exposure id\n", "for obsid in exp_ids: \n", " # make new directory to hold fits files - named by exposure id\n", + " newdir = os.path.join(os.getcwd(), obsid.lower())\n", " try:\n", - " newdir = os.getcwd()+'/'+ obsid.lower()+'/'\n", " mkdir = os.mkdir(newdir)\n", " print(f'Making new directory {newdir}')\n", " except FileExistsError: \n", - " pass\n", + " print(f'Directory {newdir} already exists.')\n", " \n", - " # loop through to get FLTs, JIFs, and JITs\n", + " # Loop through to get FLTs, JIFs, and JITs\n", " for file_type in file_types:\n", " print(f'Working on getting {file_type} files for Exposure ID {obsid}')\n", " obs_table = Observations.query_criteria(obs_id=obsid.lower())\n", @@ -209,22 +215,25 @@ " \n", " # For convenience move raws to cwd and remove empty download dir\n", " for file in download_table['Local Path']:\n", - " filename = file.split('/')[-1]\n", - " print(f'Moving {file} to {newdir+filename}')\n", - " os.rename(file, newdir+filename)\n", + " filename = os.path.basename(file)\n", + " new_file_path = os.path.join(newdir, filename)\n", + " print(f'Moving {file} to {new_file_path}')\n", + " os.rename(file, new_file_path)\n", + " remove_dir = os.path.join('mastDownload', 'HST', filename[:9]) \n", + "\n", " try:\n", - " os.rmdir(f'mastDownload/HST/{filename[:9]}')\n", - " print(f'Removing mastDownload/HST/{filename[:9]}') \n", - " os.rmdir(f'mastDownload/HST/{obsid.lower()}')\n", - " print(f'Remvoing mastDownload/HST/{obsid.lower()}') \n", + " os.rmdir(remove_dir)\n", + " print(f'Removing {remove_dir}')\n", " except (OSError, FileNotFoundError): \n", - " pass\n", - " \n", - " print(f'Remvoing mastDownload/HST/') \n", - " os.rmdir(f'mastDownload/HST/')\n", - " print(f'Remvoing mastDownload/')\n", - " os.rmdir('mastDownload/')\n", - " " + " print(f'Error removing 
directory {remove_dir}')\n",
+ "\n",
+ " mast_dir = 'mastDownload'\n",
+ " # Check and remove mastDownload directory\n",
+ " if os.path.exists(mast_dir):\n",
+ " print(f'Removing {mast_dir} directory')\n",
+ " shutil.rmtree(mast_dir)\n",
+ " else:\n",
+ " print(f'{mast_dir} does not exist')"
 ]
 },
 {
@@ -256,16 +265,6 @@
 "recalibration. "
 ]
 },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# only run this cell if you would like to see the docscring for the function\n",
- "display_image?"
- ]
- },
 {
 "cell_type": "code",
 "execution_count": null,
@@ -279,12 +278,12 @@
 "\n",
 "for f in fltfiles:\n",
 " display_image(f,\n",
- " colormaps=['Greys_r', 'Greys_r', 'inferno_r'],\n",
- " scaling=[(-10, 130), (None, None), (None, None)],\n",
- " printmeta=True,\n",
- " ima_multiread=False,\n",
- " figsize=(16, 16),\n",
- " dpi=150)"
+ " colormaps=['Greys_r', 'Greys_r', 'inferno_r'],\n",
+ " scaling=[(-10, 130), (None, None), (None, None)],\n",
+ " printmeta=True,\n",
+ " ima_multiread=False,\n",
+ " figsize=(16, 16),\n",
+ " dpi=150)"
 ]
 },
 {
@@ -315,8 +314,9 @@
 "metadata": {},
 "outputs": [],
 "source": [
- "jif_file = f'{exp_ids[0].lower()}/{exp_ids[0].lower()}_jif.fits' # Edit with the path to your own _jif.fits file\n",
- "fits.getheader(jif_file,0)[-21:-6]"
+ "# Edit with the path to your own _jif.fits file\n",
+ "jif_file = f'{exp_ids[0].lower()}/{exp_ids[0].lower()}_jif.fits' \n",
+ "fits.getheader(jif_file, 0)[-21:-6]"
 ]
 },
 {
@@ -333,7 +333,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
- "fits.getheader(jif_file,0)['T_GSFAIL*']"
+ "fits.getheader(jif_file, 0)['T_GSFAIL*']"
 ]
 },
 {
@@ -350,26 +350,26 @@
 {
 "cell_type": "code",
 "execution_count": null,
- "metadata": {
- "tags": []
- },
+ "metadata": {},
 "outputs": [],
 "source": [
 "numexts = fits.getheader(jif_file, 0)['NEXTEND'] # number of extensions i.e. exposures\n",
 "\n",
- "keywords = ['EXPNAME*','GUIDEACT*','GSACQ*','ACTGSSEP*',\n",
- " 'GSSEPRMS*','NLOSSES*','CRVAL1*','CRVAL2*',\n",
- " 'V2_RMS*','V3_RMS*','GSFAIL*']\n",
+ "keywords = ['EXPNAME*', 'GUIDEACT*', 'GSACQ*', 'ACTGSSEP*',\n",
+ " 'GSSEPRMS*', 'NLOSSES*', 'CRVAL1*', 'CRVAL2*',\n",
+ " 'V2_RMS*', 'V3_RMS*', 'GSFAIL*']\n",
 "\n",
- "for ext in range(1,numexts+1):\n",
- " print(\"JIF Header Ext Number:\",ext)\n",
+ "for ext in range(1, numexts+1):\n",
+ " print(\"JIF Header Ext Number:\", ext)\n",
 " print('-'*80)\n",
+ " header = fits.getheader(jif_file, ext)\n",
 " for keyword in keywords:\n",
 " # try to display keyword because it may not be present \n",
- " try: \n",
- " print(fits.getheader(jif_file,ext)[keyword])\n",
- " except KeyError: \n",
- " pass\n",
+ " details = header.get(keyword)\n",
+ " if details is not None:\n",
+ " print(f'{details}')\n",
+ " else:\n",
+ " print(f'Keyword {keyword} not found in extension {ext}')\n",
 " print('\\n')"
 ]
 },
@@ -410,29 +410,28 @@
 "outputs": [],
 "source": [
 "jit_file = f'{exp_ids[0].lower()}/{exp_ids[0].lower()}_jit.fits' # Edit with the path to your own _jit.fits file\n",
 "numexts = fits.getheader(jit_file, 0)['NEXTEND'] # number of extensions i.e. 
exposures\n",
 "\n",
- "figure_size = (7,5) # Edit if you want to chage the figure size\n",
+ "figure_size = (7, 5) # Edit if you want to change the figure size\n",
 "dotsperinch = 115 # Edit if you want to change the figure resolution\n",
 "\n",
- "for ext in range(1,numexts+1):\n",
+ "for ext in range(1, numexts+1):\n",
 "\n",
- " jit_tbl = Table(fits.getdata(jit_file,ext))\n",
- " expname = fits.getheader(jit_file,ext)['EXPNAME']\n",
+ " jit_tbl = Table(fits.getdata(jit_file, ext))\n",
+ " expname = fits.getheader(jit_file, ext)['EXPNAME']\n",
 " flt_file = glob.glob(f\"{exp_ids[0].lower()}/{expname[:8]}*flt.fits\")\n",
 " \n",
 " plt.figure(figsize=figure_size, dpi=dotsperinch)\n",
 " plt.grid(alpha=0.5)\n",
- " plt.scatter(jit_tbl['Seconds'],jit_tbl['SI_V2_AVG'],15,alpha=.5,marker='o',label='V2_AVG')\n",
- " plt.scatter(jit_tbl['Seconds'],jit_tbl['SI_V3_AVG'],15,alpha=.5,marker='o',label='V3_AVG')\n",
- " plt.scatter(jit_tbl['Seconds'],jit_tbl['SI_V2_RMS'],10,alpha=.5,marker='s',label='V2_RMS')\n",
- " plt.scatter(jit_tbl['Seconds'],jit_tbl['SI_V3_RMS'],10,alpha=.5,marker='s',label='V3_RMS')\n",
- " \n",
+ " plt.scatter(jit_tbl['Seconds'], jit_tbl['SI_V2_AVG'], 15, alpha=.5, marker='o', label='V2_AVG')\n",
+ " plt.scatter(jit_tbl['Seconds'], jit_tbl['SI_V3_AVG'], 15, alpha=.5, marker='o', label='V3_AVG')\n",
+ " plt.scatter(jit_tbl['Seconds'], jit_tbl['SI_V2_RMS'], 10, alpha=.5, marker='s', label='V2_RMS')\n",
+ " plt.scatter(jit_tbl['Seconds'], jit_tbl['SI_V3_RMS'], 10, alpha=.5, marker='s', label='V3_RMS')\n",
 " \n",
- " plt.xlabel('Exposure Time [Seconds]',size=13)\n",
- " plt.ylabel('Coordinate Axis [Arcsec]',size=13)\n",
- " plt.title(f\"Jitter File Ext Number: {ext}\\n Corresponding FLT: {flt_file[0].split('/')[-1]}\",size=14)\n",
- " plt.legend(prop={'size':12},ncol=2)\n",
+ " plt.xlabel('Exposure Time [Seconds]', size=13)\n",
+ " plt.ylabel('Coordinate Axis [Arcsec]', size=13)\n",
+ " plt.title(f\"Jitter File Ext Number: {ext}\\n Corresponding FLT: {flt_file[0].split('/')[-1]}\", size=14)\n",
+ " plt.legend(prop={'size': 12}, ncol=2)\n",
 " plt.minorticks_on()"
 ]
 },
 {
@@ -486,12 +485,17 @@
 "Below, we show an example of searching for sources with `photutils.detection.DAOStarFinder`
\n", "in a 100x100 pixel subsection and subsequently plotting their radial profiles using the `RadialProfile`
class within the file `rad_prof.py`. \n", "\n", - "We also import:\n", - "- *numpy* for handling arrays\n", - "- *astropy.stats.sigma_clipped_stats* for sigma clipping statistics\n", - "- *ginga.util.zscale* for scaling images\n", - "- *matplotlib.colors.LogNorm* for logarithmic normalization\n", - "- *photutils.detection.CircularAperture* for aperture photometry" + "We also import:
\n", + "\n", + "| Package Name | Purpose |\n", + "|:---------------------------------------|:---------------------------|\n", + "| `numpy` | handling arrays |\n", + "| `astropy.stats.sigma_clipped_stats` | sigma clipping statistics |\n", + "| `astropy.visualization.ZScaleInterval` | z-scaling images |\n", + "| `matplotlib.colors.LogNorm` | logarithmic normalization |\n", + "| `photutils.detection.CircularAperture` | aperture photometry |\n", + "| `photutils.detection.DAOStarFinder` | point source detection |\n", + "| `docs.rad_prof.RadialProfile` | generating radial profiles |" ] }, { @@ -500,11 +504,10 @@ "metadata": {}, "outputs": [], "source": [ - "import numpy as np\n", - "\n", "from astropy.stats import sigma_clipped_stats\n", - "from ginga.util import zscale\n", + "from astropy.visualization import ZScaleInterval\n", "from matplotlib.colors import LogNorm\n", + "import numpy as np\n", "from photutils.aperture import CircularAperture\n", "from photutils.detection import DAOStarFinder\n", "\n", @@ -532,11 +535,11 @@ "source": [ "# Read in data\n", "filename = 'iepp01010/iepp01uvq_flt.fits'\n", - "uvis2 = fits.getdata(filename,'SCI',1)\n", + "uvis2 = fits.getdata(filename, 'SCI', 1)\n", "header = fits.getheader(filename)\n", "\n", "# Trim data to 100x100 subsection\n", - "data = uvis2[:100,65:165]\n", + "data = uvis2[:100, 65:165]\n", "\n", "# 3 sigma clip data to get median and std values\n", "mean, median, std = sigma_clipped_stats(data, sigma=3.0) \n", @@ -546,23 +549,24 @@ "sources = daofind(data - median) \n", "\n", "# Truncate list to show just a few sources \n", - "sources = sources[(sources['flux'] > 10) &\\\n", - " (sources['xcentroid'] > 10) & (sources['xcentroid'] < 90) &\\\n", - " (sources['ycentroid'] > 18) & (sources['ycentroid'] < 90) ]\n", + "sources = sources[(sources['flux'] > 10) &\n", + " (sources['xcentroid'] > 10) & (sources['xcentroid'] < 90) &\n", + " (sources['ycentroid'] > 18) & (sources['ycentroid'] < 90)]\n", "\n", "# Create circular apertures to plot\n", - "positions = np.transpose((sources['xcentroid'],sources['ycentroid']))\n", + "positions = np.transpose((sources['xcentroid'], sources['ycentroid']))\n", "apertures = CircularAperture(positions, r=5.)\n", "\n", "# Get zscale image min and max limits\n", - "z1,z2 = zscale.zscale(data)\n", + "z = ZScaleInterval()\n", + "z1, z2 = z.get_limits(data)\n", "\n", "# Plot 100x100 subsection and apertures\n", - "plt.figure(figsize=(15,10))\n", - "im1 = plt.imshow(data-z1+.01, origin='lower', cmap='Greys', norm = LogNorm(vmin=.01, vmax=z2*100.-z1) )\n", + "plt.figure(figsize=(15, 10))\n", + "im1 = plt.imshow(data-z1+.01, origin='lower', cmap='Greys', norm=LogNorm(vmin=.01, vmax=z2*100.-z1))\n", "apertures.plot(color='red', lw=1.5, alpha=0.5)\n", - "plt.title(filename,size=14)\n", - "plt.colorbar(im1,pad=0.01)\n" + "plt.title(filename, size=14)\n", + "plt.colorbar(im1, pad=0.01)" ] }, { @@ -584,17 +588,17 @@ "outputs": [], "source": [ "# Loop through sources and plot star stamp next to corresponding radial profile plot\n", - "for xy in zip(sources['xcentroid'],sources['ycentroid']):\n", + "for xy in zip(sources['xcentroid'], sources['ycentroid']):\n", " \n", - " fig, [ax1,ax2] = plt.subplots(1,2,figsize=(11,5))\n", + " fig, [ax1, ax2] = plt.subplots(1, 2, figsize=(11, 5))\n", " \n", " # Calculate radial profile and plot on ax2\n", - " my_prof = RadialProfile(xy[0],xy[1],data,\n", - " r=5,\n", - " fit=True,\n", - " recenter=True,\n", - " show=True,\n", - " ax=ax2)\n", + " my_prof = RadialProfile(xy[0], xy[1], data,\n", 
+ " r=5,\n", + " fit=True,\n", + " recenter=True,\n", + " show=True,\n", + " ax=ax2)\n", " \n", " # Create boundaries for stamp \n", " x1 = int(round(my_prof.x-7))\n", @@ -603,10 +607,10 @@ " y2 = int(round(my_prof.y+7))\n", " \n", " # Plot star stamp \n", - " im1 = ax1.imshow(data[y1:y2,x1:x2]-z1+.01, origin='lower', cmap='Greys', extent= [x1,x2,y1,y2],norm = LogNorm(vmin=.01, vmax=z2*100.-z1) )\n", + " im1 = ax1.imshow(data[y1:y2, x1:x2]-z1+.01, origin='lower', cmap='Greys', extent=[x1, x2, y1, y2], norm=LogNorm(vmin=.01, vmax=z2*100.-z1))\n", "\n", - " ax1.set_title(f\"x = {my_prof.x:.3f}, y = {my_prof.y:.3f}\",size=13)\n", - " ax2.set_title(header['filter'],size=13)\n", + " ax1.set_title(f\"x = {my_prof.x:.3f}, y = {my_prof.y:.3f}\", size=13)\n", + " ax2.set_title(header['filter'], size=13)\n", " ax2.grid(alpha=0.5)\n", " fig.tight_layout()" ] @@ -671,19 +675,20 @@ "\n", "## About this Notebook \n", "\n", - "**Author:** Benjamin Kuhn, WFC3 Instrument\n", - "\n", - "**Updated On:** January 20, 2023\n", + "**Author:** Benjamin Kuhn, WFC3 Instrument
\n", + "**Updated On:** November 21, 2023\n", "\n", "## Citations \n", "\n", "If you use Python packages for published research, please cite the authors. Follow these links for more
\n", - "information about citing packages such as `astropy`, `astroquery`, `matplotlib`, or `photutils`:\n", + "information about citing packages such as `astropy`, `astroquery`, `matplotlib`, `photutils`, etc.:\n", "\n", "* [Citing `astropy`](https://www.astropy.org/acknowledging.html)\n", "* [Citing `astroquery`](https://github.com/astropy/astroquery/blob/main/astroquery/CITATION)\n", "* [Citing `matplotlib`](https://matplotlib.org/stable/users/project/citing.html)\n", + "* [Citing `numpy`](https://numpy.org/citing-numpy/)\n", "* [Citing `photutils`](https://photutils.readthedocs.io/en/stable/citation.html)\n", + "* [Citing `scipy`](https://scipy.org/citing-scipy/)\n", "
\n", "***" ] @@ -699,7 +704,7 @@ ], "metadata": { "kernelspec": { - "display_name": "Python 3", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -713,7 +718,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.5" + "version": "3.11.5" } }, "nbformat": 4, From 422c2065bfcc32aab577fe1f77499be74a6d702a Mon Sep 17 00:00:00 2001 From: Hatice Karatay <66814693+haticekaratay@users.noreply.github.com> Date: Mon, 27 Nov 2023 15:04:23 -0500 Subject: [PATCH 06/30] Enhance navigation clarity on index page (#140) * Enhance navigation clarity on index page * Update wording for better clarity --- index.md | 20 ++++++++++++++++---- 1 file changed, 16 insertions(+), 4 deletions(-) diff --git a/index.md b/index.md index d2ef21f85..ac70e8711 100644 --- a/index.md +++ b/index.md @@ -1,6 +1,19 @@ -# STScI HST Notebook Repository HQ -This page provides links to notebooks created by various Hubble Space -Telescope instrument teams and software groups, including: + +## STScI HST Notebook Repository HQ +Welcome to the STScI HST Notebook Repository +This resource provides comprehensive documentation and interactive notebooks created by the Hubble Space Telescope instruments teams. + +### Interactive Notebooks +Explore our interactive notebooks for hands-on experience with HST data. +- [ACS notebooks](./notebooks/ACS/README.md) +- [COS notebooks](./notebooks/COS/README.md) +- [DrizzlePac notebooks](./notebooks/DrizzlePac/README.md) +- [NICMOS notebooks](./notebooks/NICMOS/nicmos_unit_conversion/nicmos_unit_conversion.ipynb) +- [STIS notebooks](./notebooks/STIS/README.md) +- [WFC3 notebooks](./notebooks/WFC3/README.md) + +### Instrument Documentation +Here, you can find detailed documentation for each instrument the Hubble Space Telescope uses. 
- [Advanced Camera for Surveys (ACS)](https://www.stsci.edu/hst/instrumentation/acs) @@ -13,4 +26,3 @@ Telescope instrument teams and software groups, including: - [Space Telescope Imaging Spectrograph (STIS)](https://www.stsci.edu/hst/instrumentation/stis) - [Wide Field Camera 3 (WFC3)](https://www.stsci.edu/hst/instrumentation/wfc3) - From 0a07a7c05f3084e4a56002d155970cea378ab522 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Tue, 28 Nov 2023 08:23:07 -0500 Subject: [PATCH 07/30] Add WFC3 notebook 'calwf3_recal_tvb.ipynb' (#99) * uncommented notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb from _toc.yml * removed notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb from exclude_patterns list in _config.yml * calwf3_recal_tvb.ipynb: cleared notebook outputs * Removed calwf3_recal notebook from _config * Uncommented calwf3_recal notebook from _toc * Update README for calwf3_recal * Update requirements for calwf3_recal * Added pre-requirements for calwf3_recal to install hstacl * Update calwf3_recal notebook to PEP8 standards * Updated README with dependency instructions * Updated code for file naming/moving * Fixed indentation error * Fix indentation in the loop * Fixed indentation bug --------- Co-authored-by: FDauphin Co-authored-by: Hatice Karatay <66814693+haticekaratay@users.noreply.github.com> --- _config.yml | 1 - _toc.yml | 2 +- notebooks/WFC3/calwf3_recalibration/README.md | 14 +- .../calwf3_recal_tvb.ipynb | 222 +++++++++++------- .../calwf3_recalibration/pre-requirements.sh | 1 + .../calwf3_recalibration/requirements.txt | 1 + 6 files changed, 148 insertions(+), 93 deletions(-) create mode 100644 notebooks/WFC3/calwf3_recalibration/pre-requirements.sh diff --git a/_config.yml b/_config.yml index e7f606c78..0008a5d0f 100644 --- a/_config.yml +++ b/_config.yml @@ -44,7 +44,6 @@ html: exclude_patterns: [notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb, notebooks/DrizzlePac/sky_matching/sky_matching.ipynb, notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb, - notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb, notebooks/WFC3/dash/dash.ipynb, notebooks/WFC3/filter_transformations/filter_transformations.ipynb, notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb, diff --git a/_toc.yml b/_toc.yml index 94dbd254f..8e6eb2729 100644 --- a/_toc.yml +++ b/_toc.yml @@ -59,7 +59,7 @@ parts: - caption: WFC3 chapters: - file: notebooks/WFC3/README.md -# - file: notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb + - file: notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb - file: notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb # - file: notebooks/WFC3/dash/dash.ipynb - file: notebooks/WFC3/exception_report/wfc3_exception_report.ipynb diff --git a/notebooks/WFC3/calwf3_recalibration/README.md b/notebooks/WFC3/calwf3_recalibration/README.md index bbf6fa072..1f7a1e0ca 100644 --- a/notebooks/WFC3/calwf3_recalibration/README.md +++ b/notebooks/WFC3/calwf3_recalibration/README.md @@ -1 +1,13 @@ -A new Jupyter notebook provides `calwf3` reprocessing examples to improve calibrated WFC3/IR images affected by time-variable background. The notebook shows how to diagnose images with poor-quality ramp fits and rerun `calwf3` with the 'CRCORR' step turned off. This method is described as the 'Last-minus-first' technique [WFC3 ISR 2016-16](https://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/wfc3/documentation/instrument-science-reports-isrs/_documents/2016/WFC3-2016-16.pdf). 
See Section 3.5.2 of the [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb) for more information. +This Jupyter notebook provides `calwf3` reprocessing examples to improve calibrated WFC3/IR images affected by time-variable background (TVB). The notebook shows how to diagnose images with poor-quality ramp fits and rerun `calwf3` with the 'CRCORR' step turned off. This method is described as the 'Last-minus-first' technique [WFC3 ISR 2016-16](https://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/wfc3/documentation/instrument-science-reports-isrs/_documents/2016/WFC3-2016-16.pdf). See Section 3.5.2 of the [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb) for more information. + +Dependencies: + +Install the necessary packages using the pre-requirements.sh and requirements.txt: + + bash pre-requirements.sh + pip install -r requirements.txt + +If necessary, also install jupyter notebook: + + pip install notebook + diff --git a/notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb b/notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb index c5a53486a..37ccba91e 100755 --- a/notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb +++ b/notebooks/WFC3/calwf3_recalibration/calwf3_recal_tvb.ipynb @@ -52,23 +52,21 @@ "\n", "## 1. Imports\n", "\n", - "This notebook assumes you have created the virtual environment in [WFC3 Library's](https://github.com/spacetelescope/WFC3Library) installation instructions.\n", + "This notebook assumes you have installed the required libraries as described [here](https://github.com/spacetelescope/hst_notebooks/tree/main/notebooks/WFC3/calwf3_recalibration).\n", "\n", "We import:\n", - "- *os* for setting environment variables\n", "- *glob* for finding lists of files\n", + "- *os* for setting environment variables\n", "- *shutil* for managing directories\n", "\n", - "- *numpy* for handling array functions\n", "- *matplotlib.pyplot* for plotting data\n", "- *astropy.io fits* for accessing FITS files\n", "- *astroquery.mast Observations* for downloading data from MAST\n", - "\n", - "- *wfc3tools pstat* for plotting statistics of WFC3 data\n", - "- *wfc3tools calwf3* for calibrating WFC3 data\n", "- *ccdproc* for building the association\n", + "- *drizzlepac astrodrizzle* for combining images\n", "- *stwcs* for updating the World Coordinate System\n", - "- *drizzlepac astrodrizzle* for combining images" + "\n", + "- *wfc3tools calwf3 and pstat* for calibrating WFC3 data and plotting statistics of WFC3 data" ] }, { @@ -79,20 +77,19 @@ "source": [ "%matplotlib inline\n", "\n", - "import os\n", "import glob\n", + "import os\n", "import shutil \n", "\n", - "import numpy as np\n", "import matplotlib.pyplot as plt\n", + "\n", "from astropy.io import fits\n", "from astroquery.mast import Observations\n", - "\n", - "from wfc3tools import pstat\n", - "from wfc3tools import calwf3\n", "from ccdproc import ImageFileCollection\n", + "from drizzlepac import astrodrizzle\n", "from stwcs import updatewcs\n", - "from drizzlepac import astrodrizzle" + "\n", + "from wfc3tools import calwf3, pstat" ] }, { @@ -110,22 +107,32 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": true - }, + "metadata": {}, "outputs": [], "source": [ "data_list = Observations.query_criteria(obs_id='IBOHBF040')\n", "\n", - "Observations.download_products(data_list['obsid'],project='CALWF3',download_dir='./data',\n", - " mrp_only=False,productSubGroupDescription=['ASN','RAW','IMA','FLT','DRZ'])\n", + "Observations.download_products(\n", + " 
data_list['obsid'], \n",
+ " project='CALWF3', \n",
+ " download_dir='./data', \n",
+ " mrp_only=False, \n",
+ " productSubGroupDescription=['ASN', 'RAW', 'IMA', 'FLT', 'DRZ'])\n",
 "\n",
 "science_files = glob.glob('data/mastDownload/HST/*/*fits')\n",
 "\n",
 "for im in science_files:\n",
- " root = im.split('/')[-1]\n",
- " os.rename(im,'./'+root)\n",
- "shutil.rmtree('data/')"
+ " root = os.path.basename(im)\n",
+ " new_path = os.path.join('.', root)\n",
+ " os.rename(im, new_path)\n",
+ "\n",
+ "data_directory = './data'\n",
+ "\n",
+ "try:\n",
+ " if os.path.isdir(data_directory):\n",
+ " shutil.rmtree(data_directory)\n",
+ "except Exception as e:\n",
+ " print(f\"An error occurred while deleting the directory {data_directory}: {e}\")"
 ]
 },
 {
@@ -141,11 +148,32 @@
 "metadata": {},
 "outputs": [],
 "source": [
- "collec = ImageFileCollection('./',\n",
- " keywords=[\"asn_id\",\"targname\",\"filter\",\"samp_seq\",\"nsamp\",\"exptime\",\n",
- " \"postarg1\",\"postarg2\",\"date-obs\",\"time-obs\",], glob_include=\"*flt.fits\", ext=0)\n",
- "out_table = collec.summary\n",
- "out_table"
+ "image_collection = ImageFileCollection(\n",
+ " './',\n",
+ " keywords=[\n",
+ " \"asn_id\",\n",
+ " \"targname\",\n",
+ " \"filter\",\n",
+ " \"samp_seq\",\n",
+ " \"nsamp\",\n",
+ " \"exptime\",\n",
+ " \"postarg1\",\n",
+ " \"postarg2\",\n",
+ " \"date-obs\",\n",
+ " \"time-obs\",\n",
+ " ], \n",
+ " glob_include=\"*flt.fits\",\n",
+ " ext=0,\n",
+ ")\n",
+ "\n",
+ "try:\n",
+ " summary_table = image_collection.summary\n",
+ " if summary_table:\n",
+ " print(summary_table)\n",
+ " else:\n",
+ " print(\"No FITS files matched the pattern or no relevant data found.\")\n",
+ "except Exception as e:\n",
+ " print(f\"An error occurred while creating the summary table: {e}\")"
 ]
 },
 {
@@ -194,7 +222,7 @@
 "raw_files = glob.glob('*_raw.fits')\n",
 "\n",
 "for file in raw_files:\n",
- " command_line_input = 'crds bestrefs --files {:} --sync-references=1 --update-bestrefs'.format(file)\n",
+ " command_line_input = f'crds bestrefs --files {file} --sync-references=1 --update-bestrefs'\n",
 " os.system(command_line_input)"
 ]
 },
 {
@@ -226,7 +254,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
- "fits.getdata('ibohbf040_asn.fits',1)"
+ "fits.getdata('ibohbf040_asn.fits', 1)"
 ]
 },
 {
@@ -245,15 +273,15 @@
 "b7q_data = fits.getdata('ibohbfb7q_flt.fits', ext=1)\n",
 "b9q_data = fits.getdata('ibohbfb9q_flt.fits', ext=1)\n",
 "\n",
- "fig = plt.figure(figsize=(15,8))\n",
- "ax1 = fig.add_subplot(1,2,1)\n",
- "ax2 = fig.add_subplot(1,2,2)\n",
+ "fig = plt.figure(figsize=(15, 8))\n",
+ "ax1 = fig.add_subplot(1, 2, 1)\n",
+ "ax2 = fig.add_subplot(1, 2, 2)\n",
 "\n",
- "ax1.imshow(b7q_data, vmin=0.25,vmax=1.25,cmap='Greys_r',origin='lower')\n",
- "ax2.imshow(b9q_data, vmin=1.25,vmax=2.25,cmap='Greys_r',origin='lower')\n",
+ "ax1.imshow(b7q_data, vmin=0.25, vmax=1.25, cmap='Greys_r', origin='lower')\n",
+ "ax2.imshow(b9q_data, vmin=1.25, vmax=2.25, cmap='Greys_r', origin='lower')\n",
 "\n",
- "ax1.set_title('ibohbfb7q (Linear Bkg)',fontsize=20)\n",
- "ax2.set_title('ibohbfb9q (Non-linear Bkg)',fontsize=20)"
+ "ax1.set_title('ibohbfb7q (Linear Bkg)', fontsize=20)\n",
+ "ax2.set_title('ibohbfb9q (Non-linear Bkg)', fontsize=20)"
 ]
 },
 {
@@ -271,15 +299,20 @@
 },
 "outputs": [],
 "source": [
- "fig = plt.figure(figsize=(15,3))\n",
- "ax1 = fig.add_subplot(1,2,1)\n",
- "ax2 = fig.add_subplot(1,2,2)\n",
+ "fig = plt.figure(figsize=(15, 3))\n",
+ "ax1 = fig.add_subplot(1, 2, 1)\n",
+ "ax2 = fig.add_subplot(1, 2, 2)\n",
 "\n",
- "n, bins, patches = ax1.hist(b7q_data.flatten(),bins=200,range=(0,1))\n",
- "n, bins, patches = ax2.hist(b9q_data.flatten(),bins=200,range=(1,2))\n",
+ "n, bins, patches = 
ax1.hist(b7q_data.flatten(), bins=200, range=(0, 1))\n", + "n, bins, patches = ax2.hist(b9q_data.flatten(), bins=200, range=(1, 2))\n", "\n", - "n, bins, patches = ax1.hist(b7q_data.flatten(),bins=200,range=(0,1))\n", - "n, bins, patches = ax2.hist(b9q_data.flatten(),bins=200,range=(1,2))\n", + "ax1.set_title('ibohbfb7q (Linear Bkg)', fontsize=15)\n", + "ax1.set_xlabel('Count Rate (e-/s)')\n", + "ax1.set_ylabel('Frequency')\n", "\n", - "ax1.set_title('ibohbfb7q (Linear Bkg)',fontsize=15)\n", - "ax2.set_title('ibohbfb9q (Non-linear Bkg)',fontsize=15)" + "ax2.set_title('ibohbfb9q (Non-linear Bkg)', fontsize=15)\n", + "ax2.set_xlabel('Count Rate (e-/s)')\n", + "ax2.set_ylabel('Frequency')" ] }, { @@ -357,14 +390,14 @@ "metadata": {}, "outputs": [], "source": [ - "os.mkdir('orig/')\n", + "os.makedirs('orig/', exist_ok=True)\n", "\n", - "for imas in glob.glob('ibohbf*_ima.fits'):\n", - " shutil.move(imas,'orig/')\n", - "for flts in glob.glob('ibohbf*_flt.fits'):\n", - " shutil.move(flts,'orig/') \n", - "for driz in glob.glob('ibohbf*_drz.fits'):\n", - " shutil.move(driz,'orig/') " + "for file_pattern in ['ibohbf*_ima.fits', 'ibohbf*_flt.fits', 'ibohbf*_drz.fits']:\n", + " for file in glob.glob(file_pattern):\n", + " destination_path = os.path.join('orig', os.path.basename(file))\n", + " if os.path.isfile(destination_path):\n", + " os.remove(destination_path)\n", + " shutil.move(file, destination_path) " ] }, { @@ -398,18 +431,18 @@ "metadata": {}, "outputs": [], "source": [ - "b9q_data = fits.getdata('orig/ibohbfb9q_flt.fits', ext=1)\n", - "b9q_newdata = fits.getdata('ibohbfb9q_flt.fits', ext=1)\n", + "b9q_data = fits.getdata('orig/ibohbfb9q_flt.fits', ext=1)\n", + "b9q_newdata = fits.getdata('ibohbfb9q_flt.fits', ext=1)\n", "\n", - "fig = plt.figure(figsize=(15,8))\n", - "ax1 = fig.add_subplot(1,2,1)\n", - "ax2 = fig.add_subplot(1,2,2)\n", + "fig = plt.figure(figsize=(15, 8))\n", + "ax1 = fig.add_subplot(1, 2, 1)\n", + "ax2 = fig.add_subplot(1, 2, 2)\n", "\n", - "ax1.imshow(b9q_data[520:720,750:970], vmin=1.25,vmax=2.25,cmap='Greys_r',origin='lower')\n", - "ax2.imshow(b9q_newdata[520:720,750:970], vmin=1.25,vmax=2.25,cmap='Greys_r',origin='lower')\n", + "ax1.imshow(b9q_data[520:720, 750:970], vmin=1.25, vmax=2.25, cmap='Greys_r', origin='lower')\n", + "ax2.imshow(b9q_newdata[520:720, 750:970], vmin=1.25, vmax=2.25, cmap='Greys_r', origin='lower')\n", "\n", - "ax1.set_title('ibohbfb9q (Original)', fontsize=20)\n", - "ax2.set_title('ibohbfb9q (Reprocessed)',fontsize=20)" + "ax1.set_title('ibohbfb9q (Original)', fontsize=20)\n", + "ax2.set_title('ibohbfb9q (Reprocessed)', fontsize=20)" ] }, { @@ -427,15 +460,20 @@ }, "outputs": [], "source": [ - "fig = plt.figure(figsize=(15,3))\n", - "ax1 = fig.add_subplot(1,2,1)\n", - "ax2 = fig.add_subplot(1,2,2)\n", + "fig = plt.figure(figsize=(15, 3))\n", + "ax1 = fig.add_subplot(1, 2, 1)\n", + "ax2 = fig.add_subplot(1, 2, 2)\n", "\n", - "n, bins, patches = ax1.hist(b9q_data.flatten(), bins=200,range=(1,2))\n", - "n, bins, patches = ax2.hist(b9q_newdata.flatten(),bins=200,range=(1,2))\n", + "n, bins, patches = ax1.hist(b9q_data.flatten(), bins=200, range=(1, 2))\n", + "n, bins, patches = ax2.hist(b9q_newdata.flatten(), bins=200, range=(1, 2))\n", "\n", - "ax1.set_title('ibohbfb9q (Original FLT)', fontsize=15)\n", - "ax2.set_title('ibohbfb9q (Reprocessed FLT)',fontsize=15)" + "ax1.set_title('ibohbfb9q (Original FLT)', fontsize=15)\n", + "ax1.set_xlabel('Count Rate (e-/s)')\n", + "ax1.set_ylabel('Frequency')\n", + "\n", + "ax2.set_title('ibohbfb9q (Reprocessed 
FLT)', fontsize=15)\n", + "ax2.set_xlabel('Count Rate (e-/s)')\n", + "ax2.set_ylabel('Frequency')" ] }, { @@ -468,7 +506,7 @@ "metadata": {}, "outputs": [], "source": [ - "dat = fits.getdata('ibohbf040_asn.fits',1)\n", + "dat = fits.getdata('ibohbf040_asn.fits', 1)\n", "dat" ] }, @@ -558,10 +596,10 @@ "source": [ "calwf3('ibohbf040_asn.fits')\n", "\n", - "#Alternatively, calwf3 may be run on a list of RAW files rather than the ASN\n", + "# Alternatively, calwf3 may be run on a list of RAW files rather than the ASN\n", "\n", - "#for raws in glob.glob('ibohbf*_raw.fits'):\n", - "# calwf3(raws)" + "# for raws in glob.glob('ibohbf*_raw.fits'):\n", + "# calwf3(raws)" ] }, { @@ -617,17 +655,17 @@ "outputs": [], "source": [ "drz_origdata = fits.getdata('orig/ibohbf040_drz.fits', ext=1)\n", - "drz_newdata = fits.getdata('ibohbf040_drz.fits', ext=1)\n", + "drz_newdata = fits.getdata('ibohbf040_drz.fits', ext=1)\n", "\n", - "fig = plt.figure(figsize=(15,8))\n", - "ax1 = fig.add_subplot(1,2,1)\n", - "ax2 = fig.add_subplot(1,2,2)\n", + "fig = plt.figure(figsize=(15, 8))\n", + "ax1 = fig.add_subplot(1, 2, 1)\n", + "ax2 = fig.add_subplot(1, 2, 2)\n", "\n", - "ax1.imshow(drz_origdata[520:720,750:970], vmin=0.4,vmax=0.6,cmap='Greys_r',origin='lower')\n", - "ax2.imshow(drz_newdata[520:720,750:970], vmin=0.4,vmax=0.6,cmap='Greys_r',origin='lower')\n", + "ax1.imshow(drz_origdata[520:720, 750:970], vmin=0.4, vmax=0.6, cmap='Greys_r', origin='lower')\n", + "ax2.imshow(drz_newdata[520:720, 750:970], vmin=0.4, vmax=0.6, cmap='Greys_r', origin='lower')\n", "\n", - "ax1.set_title('Original DRZ',fontsize=20)\n", - "ax2.set_title('Reprocessed DRZ',fontsize=20)" + "ax1.set_title('Original DRZ', fontsize=20)\n", + "ax2.set_title('Reprocessed DRZ', fontsize=20)" ] }, { @@ -636,15 +674,15 @@ "metadata": {}, "outputs": [], "source": [ - "fig = plt.figure(figsize=(15,3))\n", - "ax1 = fig.add_subplot(1,2,1)\n", - "ax2 = fig.add_subplot(1,2,2)\n", + "fig = plt.figure(figsize=(15, 3))\n", + "ax1 = fig.add_subplot(1, 2, 1)\n", + "ax2 = fig.add_subplot(1, 2, 2)\n", "\n", - "n, bins, patches = ax1.hist(drz_origdata.flatten(),bins=200,range=(0.4,0.52))\n", - "n, bins, patches = ax2.hist(drz_newdata.flatten(), bins=200,range=(0.4,0.52))\n", + "n, bins, patches = ax1.hist(drz_origdata.flatten(), bins=200, range=(0.4, 0.52))\n", + "n, bins, patches = ax2.hist(drz_newdata.flatten(), bins=200, range=(0.4, 0.52))\n", "\n", "ax1.set_title('Original DRZ', fontsize=15)\n", - "ax2.set_title('Reprocessed DRZ',fontsize=15)" + "ax2.set_title('Reprocessed DRZ', fontsize=15)" ] }, { @@ -683,19 +721,23 @@ "\n", "**Authors:** Jennifer Mack, Harish Khandrika; WFC3 Instrument Team\n", "\n", - "**Updated on:** 2021-09-13\n", + "**Created on:** 2021-09-13\n", + "\n", + "**Updated on:** 2023-11-16\n", + "\n", + "**Source:** The notebook is sourced from [hst_notebooks/notebooks/WFC3/calwf3_recalibration](https://github.com/spacetelescope/hst_notebooks/tree/main/notebooks/WFC3/calwf3_recalibration).\n", "\n", "\n", "## Citations\n", "\n", - "If you use `numpy`, `astropy`, `astroquery`, `wfc3tools`, or `drizzlepac` for published research, please cite the\n", + "If you use `matplotlib`, `astropy`, `astroquery`, `drizzlepac`, or `wfc3tools` for published research, please cite the\n", "authors. 
Follow these links for more information about citing the libraries below:\n", "\n", - "* [Citing `numpy`](https://numpy.org/citing-numpy/)\n", + "* [Citing `matplotlib`](https://matplotlib.org/stable/users/project/citing.html)\n", "* [Citing `astropy`](https://www.astropy.org/acknowledging.html)\n", - "* [Citing `astroquery`](https://astroquery.readthedocs.io/en/latest/)\n", - "* [Citing `wfc3tools`](https://wfc3tools.readthedocs.io/en/latest/)\n", + "* [Citing `astroquery`](https://astroquery.readthedocs.io/en/latest/license.html)\n", "* [Citing `drizzlepac`](https://drizzlepac.readthedocs.io/en/latest/LICENSE.html)\n", + "* [Citing `wfc3tools`](https://wfc3tools.readthedocs.io/en/latest/)\n", "\n", "***\n", "[Top of Page](#title)\n", @@ -712,7 +754,7 @@ ], "metadata": { "kernelspec": { - "display_name": "Python 3", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -726,7 +768,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.5" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/WFC3/calwf3_recalibration/pre-requirements.sh b/notebooks/WFC3/calwf3_recalibration/pre-requirements.sh new file mode 100644 index 000000000..23a5dffae --- /dev/null +++ b/notebooks/WFC3/calwf3_recalibration/pre-requirements.sh @@ -0,0 +1 @@ +conda install --yes -c conda-forge hstcal \ No newline at end of file diff --git a/notebooks/WFC3/calwf3_recalibration/requirements.txt b/notebooks/WFC3/calwf3_recalibration/requirements.txt index 26df866c6..33e20b5ab 100644 --- a/notebooks/WFC3/calwf3_recalibration/requirements.txt +++ b/notebooks/WFC3/calwf3_recalibration/requirements.txt @@ -1,6 +1,7 @@ astropy==5.2.1 astroquery==0.4.6 ccdproc==2.4.0 +crds==11.17.9 drizzlepac==3.5.1 matplotlib==3.7.0 numpy==1.23.4 From 5716becfd07d27222dc43bcc18ec09be68b4450f Mon Sep 17 00:00:00 2001 From: dulude Date: Tue, 28 Nov 2023 15:04:50 -0500 Subject: [PATCH 08/30] wfc3_exception_report.ipynb: fixed image rendering --- .../WFC3/exception_report/wfc3_exception_report.ipynb | 8 +++++--- 1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb b/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb index c1089bb4a..f5199043e 100755 --- a/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb +++ b/notebooks/WFC3/exception_report/wfc3_exception_report.ipynb @@ -79,7 +79,8 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "" + "\n", + "![typical_images.png](docs/typical_images.png)" ] }, { @@ -95,7 +96,8 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n" + "\n", + "![flt_vs_flc.gif](docs/flt_vs_flc.gif)\n" ] }, { @@ -718,7 +720,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.5" + "version": "3.8.12" } }, "nbformat": 4, From d5401f2bd468b300e5388de222827a5d863cf3ae Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Thu, 30 Nov 2023 15:25:26 -0500 Subject: [PATCH 09/30] Create weekly_broken_link_finder.yml --- .github/workflows/weekly_broken_link_finder.yml | 8 ++++++++ 1 file changed, 8 insertions(+) create mode 100644 .github/workflows/weekly_broken_link_finder.yml diff --git a/.github/workflows/weekly_broken_link_finder.yml b/.github/workflows/weekly_broken_link_finder.yml new file mode 100644 index 000000000..9b38dd56a --- /dev/null +++ b/.github/workflows/weekly_broken_link_finder.yml @@ -0,0 +1,8 @@ +name: Weekly broken link check +on: + schedule: + - cron: '0 4 * * 0' # 0400 UTC 
every Sunday + +jobs: + Scheduled: + uses: spacetelescope/notebook-ci-actions/.github/workflows/broken_link_checker.yml@main From 395bb050fb1be579eefd44b4a827c26d7d50c526 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Fri, 1 Dec 2023 12:52:23 -0500 Subject: [PATCH 10/30] Updated syntax used to display images so they properly render in generated HTML pages (#149) * AsnFile.ipynb: replaced HTML image display dispay with markdown so image will render in HTML version of notebook. * CalCOS.ipynb: replaced HTML image display with markdown so image will render in HTML version of notebook. * DataDl.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * DayNight.ipynb: replaced HTML image display code with markdown so image will render in HTML version of notebook. * LSF.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * Setup.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * ViewData.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * calstis_2d_ccd.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * STIS_Coronagraphy_Visualization_v2.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * STIS_DrizzlePac_Tutorial.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * calwf3_with_v1.0_PCTE.ipynb: replaced HTML image display code with markdown so images will render in HTML version of notebook. * AsnFile.ipynb: Fixed PEP8 issue. * LSF.ipynb: Fixed PEP8 issue. * calstis_2d_ccd.ipynb: added title to image. --- notebooks/COS/AsnFile/AsnFile.ipynb | 8 ++- notebooks/COS/CalCOS/CalCOS.ipynb | 6 +- notebooks/COS/DataDl/DataDl.ipynb | 69 ++++++++++++++----- notebooks/COS/DayNight/DayNight.ipynb | 6 +- notebooks/COS/LSF/LSF.ipynb | 11 +-- notebooks/COS/Setup/Setup.ipynb | 15 ++-- notebooks/COS/ViewData/ViewData.ipynb | 22 ++++-- .../STIS_Coronagraphy_Visualization_v2.ipynb | 6 +- notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 14 ++-- .../STIS_DrizzlePac_Tutorial.ipynb | 21 ++++-- .../calwf3_with_v1.0_PCTE.ipynb | 13 ++-- 11 files changed, 134 insertions(+), 57 deletions(-) diff --git a/notebooks/COS/AsnFile/AsnFile.ipynb b/notebooks/COS/AsnFile/AsnFile.ipynb index 2c6b04e53..fbdf4fffa 100644 --- a/notebooks/COS/AsnFile/AsnFile.ipynb +++ b/notebooks/COS/AsnFile/AsnFile.ipynb @@ -307,7 +307,9 @@ "\n", "### Figure 1. Comparison of fluxes between the data retrieved from MAST, and the same data reprocessed after removing the bad exposure\n", "\n", - "A comparison between the fluxes of our data. One dataset if without our bad exposure removed, and the other is with our bad exposure removed. Both plots are relatively the same, as they should be." + "\n", + "\n", + "![A comparison between the fluxes of our data. One dataset if without our bad exposure removed, and the other is with our bad exposure removed. 
Both plots are relatively the same, as they should be.](./figures/compare_fluxes_after_removing_badfile.png \"Comparison of fluxes\")" ] }, { @@ -905,7 +907,7 @@ " \"Exposure_type\": rawtag_a_exptypes,\n", " # Date in MJD\n", " \"Exposure_start_date\": rawtag_a_expstart_times,\n", - " \"Seconds_since_first_exposure\":\\\n", + " \"Seconds_since_first_exposure\": \\\n", " # Convert time since the first exposure into seconds\n", " 86400*np.subtract(rawtag_a_expstart_times, min(rawtag_a_expstart_times))\n", "})" @@ -1112,7 +1114,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" }, "nbdime-conflicts": { "local_diff": [ diff --git a/notebooks/COS/CalCOS/CalCOS.ipynb b/notebooks/COS/CalCOS/CalCOS.ipynb index 5ffb8dc23..9ec563ecd 100644 --- a/notebooks/COS/CalCOS/CalCOS.ipynb +++ b/notebooks/COS/CalCOS/CalCOS.ipynb @@ -420,7 +420,9 @@ "\n", "Caution!\n", " \n", - "\"A \n", + "\n", + "\n", + "![A warning symbol. Watch out!](./figures/warning.png \"CAUTION!\")\n", "\n", "*Note* that as of the time of this Notebook's update, the pipeline context used below was **`hst_1071.pmap`**, but this changes over time. You are running this in the future, and there is certainly a newer context you would be better off working with. Take a minute to consider this, and check the [HST Calibration Reference Data System webpage](http://hst-crds.stsci.edu/) to determine what the **current operational pmap file** is. " ] @@ -1096,7 +1098,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/COS/DataDl/DataDl.ipynb b/notebooks/COS/DataDl/DataDl.ipynb index f4f2455fd..1afce398c 100644 --- a/notebooks/COS/DataDl/DataDl.ipynb +++ b/notebooks/COS/DataDl/DataDl.ipynb @@ -193,7 +193,9 @@ "source": [ "The search page of the HST-specific interface is laid out as in Figure 1.1:\n", "### Fig 1.1\n", - "
\"An
\n", + "\n", + "\n", + "![An image of the MAST search site. On the page there are boxes for each different criteria that you want in your search, such as Object Name in the top left, active instruments below that, and at the bottom of the page there is a box that you can fill with a certain column name and your desired criteria (if the criteria is not already on the page). At the top right is the 'My ST' button.](./figures/new_hst_search_login.png \"New HST-specific search website\")\n", "\n", "If you are accessing proprietary data, you will need to make an account or log in at the top right under \"MY ST\" (Fig 1.1, boxed in red). If you are accessing non-proprietary data, you may continue without logging in.\n" ] @@ -209,7 +211,9 @@ "* Are taken with the COS instrument, using the G160M grating and either the 1533 or the 1577 cenwave setting\n", "\n", "### Fig 1.2\n", - "
\"\"
" + "\n", + "\n", + "![New HST-specific website search filled out with a COS data query, boxes with each criteria from above are highlighted with a red box. The central wavelength condition was added to the bottom box where you choose columns for criteria, since the central wavelength is not a pre-given criteria.](figures/hst_search_query_updated.png \"New HST-specific website search filled out with a COS data query\")" ] }, { @@ -219,7 +223,9 @@ "The above search results in the table shown in Figure 1.3. \n", "\n", "### Fig 1.3\n", - "
\"The
\n", + "\n", + "\n", + "![The page is now the search results page. At the top left are the 'Edit Search' box highlighted in dashed red. Below that is the 'Download Dataset' button highlighted with a green circle. Under that is the list of datasets, with each row beginning with a empty checkbox, followed by information about the dataset, such as search position, dataset name, target name, etc.](figures/new_hst_search_results.png \"Results from new HST-specific search website query\")\n", "\n", "If you need to change some parameters in your search - for instance, to also find data from the G130M grating - click on \"Edit Search\" (Fig 1.3, red dashed box).\n", "\n", @@ -234,7 +240,9 @@ "Most COS spectra have preview images (simple plots of flux by wavelength) which can be viewed before downloading the data. Clicking the dataset name (Fig 1.3, blue dashed oval) will take you to a page which shows the preview image, as well as some basic information about the data and whether there were any known failures during the operation. An example of such a page is shown in Figure 1.4.\n", "\n", "### Fig 1.4\n", - "
\"The
" + "\n", + "\n", + "![The page is now an image of a spectrum preview for dataset LPXK51020. It shows two plots of wavelength vs flux. Below is a list of exposure information, such as observation data, exposure time, release date, mode. At the very bottom of the page is the proposal ID along with the PI and CoIs.](figures/preview_spectrum_small.png \"Preview spectrum page\")" ] }, { @@ -244,10 +252,15 @@ "Returning to the results page shown in Fig 1.3 and clicking \"Download Data\" opens a window as shown in Figure 1.5. In this window you can search for filetypes using the search bar, and unselect/select all the data products shown in the filtered list (Fig 1.5, green circle). Clicking the \"Showing all filetypes\" box (Figure 1.5, red box) shows a drop-down (Figure 1.6) where you can choose to show/hide certain types of data such as the uncalibrated data. \n", "\n", "### Fig 1.5\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the a table of files in dataset LDJ1010. Each row shows different CalCOS products, such as the ASN file, X1DSUM, JIT, etc. The title of the table is the dataset name, and to the left of the name is the button to check all or none exposures, highlighted with a green circle. There are columns on the table that give information about the insturment used (COS) and the filter/grating (G140L). Above the table is a dropdown that allows you to choose the filetypes you wish to download.](figures/new_hst_search_downloading_all_files.png \"Choosing what to download in the new HST search website\")\n", "\n", "### Fig 1.6\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the same page as Figure 1.5, but the 'Showing all file types' dropdown button has been clicked. There are rows that have checkboxes on the left, each row from top to bottom is labled 'Calibrated', 'Uncalibrated', 'Aux/Other', 'Reference', 'Log/Jitter'](figures/new_hst_search_downloading_filetypes.png \"Choosing which data to download (calibrated/uncalibrated/etc.\")\n", "\n", "When all of your desired data products are checked, click \"Start Download\" (Fig 1.5, yellow dashed box). This will download a compressed \"zipped\" folder of all of your data, divided into subdirectories by the observation. Most operating systems can decompress these folders by default. For help decompressing the zipped files, you can follow these links for: [Windows](https://support.microsoft.com/en-us/windows/zip-and-unzip-files-8d28fa72-f2f9-712f-67df-f80cf89fd4e5) and [Mac](https://support.apple.com/guide/mac-help/zip-and-unzip-files-and-folders-on-mac-mchlp2528/mac). There are numerous ways to do this on Linux, however we have not vetted them." ] @@ -264,7 +277,9 @@ "\n", "### Fig 1.7\n", "\n", - "
\"The
" + "\n", + "![The image is showing the search page for MAST again. Halfway down the page is a box to input dataset name criteria. Above that is the obervation type box, which is composed of three check boxes. These boxes from left to right are 'All', 'Science', and 'Calibration'. The science box is the only one that is checked.](figures/new_hst_search_query2_small.png \"New HST-specific website search filled out with a specific dataset ID\")" ] }, { @@ -319,7 +334,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 1, "metadata": {}, "outputs": [], "source": [ @@ -347,7 +362,9 @@ "Because we are searching by Dataset ID, we don't need to specify any additional parameters to narrow down the data.\n", "\n", "### Fig 1.8\n", - "
\"The
\n" + "\n", + "\n", + "![The image shows the MAST search page, with the top box filled in. The title of this box is 'Object name(s) and/or RA and Dec pair(s). The filled in box is the text file obsId_list.txt. The observations box below it has only the science box checked off.](figures/new_search_file_list_small.png \"File Upload Search Form\")\n" ] }, { @@ -357,7 +374,9 @@ "We now can access all the datasets specified in `obsId_list.txt`, as shown in Figure 1.9:\n", "\n", "### Fig 1.9\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the MAST search results page, with the three datasets from our text file shows in each row.](figures/new_search_file_list_res_small.png \"Upload List of Objects Search Results\")\n", "\n", "We can select and download their data products as before." ] @@ -375,11 +394,15 @@ "\n", "Navigate to the MAST Portal at , and you will be greeted by a screen where the top looks like Figure 1.10. \n", "### Fig 1.10\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the MAST portal, specifically the search box at the top.](figures/mastp_top.png \"Top of MAST Portal Home\")\n", "\n", "Click on \"Advanced Search\" (boxed in red in Figure 1.10). This will open up a new search tab, as shown in Figure 1.11:\n", "### Fig 1.11\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the advanced search pop-up. On the left of the pop-up is a list of check boxes for different search criteria, such as Target name, instrument, mission, etc. To the left of that are larger boxes for different critera, and inside these mobes are rows with checkboxes. For example, there is a larger box labeled 'Observation type' with two checkbox rows labeled 'science' and 'calibration'.](figures/mastp_adv.png \"The advanced search tab\")\n", "\n", "Fig 1.11 (above) shows the default search fields which appear. Depending on what you are looking for, these may or may not be the most helpful search fields. By unchecking some of the fields which we are not interested in searching by right now (Figure 1.12, boxed in green), and then entering the parameter values by which to narrow the search into each parameter's box, we generate Fig 1.12. One of the six fields (Mission) by which we are narrowing is boxed in a dashed blue line. The list of applied filters is boxed in red. A dashed pink box at the top left indicates that 2 records were found matching all of these parameters. To its left is an orange box around the \"Search\" button to press to bring up the list of results.\n", "\n", @@ -395,7 +418,9 @@ "|Product Type|spectrum|\n", "\n", "### Fig 1.12\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the same advanced search pop-up, filled out with our desired criteria. The criteria is highlighted, and above it is a list of the applied filters highlighted in red. The top of the pop-up has a 'Search' button, and also lists the records found. There are two records for our search.](figures/mastp_adv_2.png \"The advanced search tab with some selections\")\n", "\n" ] }, @@ -406,7 +431,9 @@ "Click the \"Search\" button (boxed in orange), and you will be brought to a page resembling Figure 1.13. \n", "\n", "### Fig 1.13\n", - "
\"The
" + "\n", + "\n", + "![The image shows the search results list, with consists of two rows for our search. To the left of the rows are checkboxes, then there is are images of a disk, spectra, and three dots. The rest of the columns of the rows show some criteria, such as the mission, observation type, etc. To the far right is the image of our object.](figures/mastp_res1.png \"Results of MAST Portal search\")" ] }, { @@ -443,7 +470,9 @@ "metadata": {}, "source": [ "### Fig 1.14\n", - "
\"The
\n", + "\n", + "\n", + "![The image shows the Download Manager. The left is a list of filters, where you can choose recommended products, product categories, extensions, and groups. To the left is a file list of all files for the observations. There are columns that label the file size, name, product type, etc.](figures/mastp_cart2.png \"MAST Portal Download Basket\")\n", "\n", "Each dataset contains *many* files, most of which are calibration files or intermediate processing files. You may or may not want some of these intermediate files in addition to the final product file.\n", "In the leftmost \"Filters\" section of the Download Basket page, you can narrow which files will be downloaded (Fig 1.14, boxed in red).\n", @@ -590,7 +619,9 @@ "source": [ "Caution! \n", " \n", - "\"This \n", + "\n", + "\n", + "![This image is a warning symbol.](figures/warning.png \"CAUTION\")\n", "\n", "Please note that these queries are `Astropy` tables and do not always respond as expected for other data structures like `Pandas DataFrames`. For instance, the first way of filtering a table shown below is correct, but the second will consistently produce the *wrong result*. You *must* search and filter these tables by masking them, as in the first example below." ] @@ -645,7 +676,9 @@ "- Support files such as the spacecraft's pointing data over time (`jit` files).\n", "- Intermediate data products such as calibrated TIME-TAG data (`corrtag` or `corrtag_a`/`corrtag_b` files) and extracted 1-dimensional spectra averaged over exposures with a specific `FP-POS` value (`x1dsum` files).\n", "\n", - "\"This\n", + "\n", + "\n", + "![This image is a warning symbol.](figures/warning.png \"CAUTION\")\n", "\n", "However, use caution with downloading all files, as in this case, setting `mrp_only` to `False` results in the transfer of **7 Gigabytes** of data, which can take a long time to transfer and eat away at your computer's storage! In general, only download the files you need. On the other hand, often researchers will download only the raw data, so that they can process it for themselves. Since here we only need the final `x1dsum` and `asn` files, we only need to download 2 Megabytes." ] @@ -894,7 +927,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.9" + "version": "3.8.12" }, "vscode": { "interpreter": { diff --git a/notebooks/COS/DayNight/DayNight.ipynb b/notebooks/COS/DayNight/DayNight.ipynb index e7c142ef4..55383cce3 100644 --- a/notebooks/COS/DayNight/DayNight.ipynb +++ b/notebooks/COS/DayNight/DayNight.ipynb @@ -524,7 +524,9 @@ "\n", " Caution!\n", "\n", - " \n", + "\n", + "\n", + "![CAUTION!](figures/warning.png \"CAUTION!\")\n", "\n", "**The process in the following two cells can take a long time and strain network resources!** If you have already downloaded *up-to-date* COS reference files, avoid doing so again.\n", " \n", @@ -868,7 +870,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/COS/LSF/LSF.ipynb b/notebooks/COS/LSF/LSF.ipynb index 556def5fe..9e2854ab4 100644 --- a/notebooks/COS/LSF/LSF.ipynb +++ b/notebooks/COS/LSF/LSF.ipynb @@ -298,12 +298,15 @@ "The COS team maintains up-to-date LSF files on the [COS Spectral Resolution page](https://www.stsci.edu/hst/instrumentation/cos/performance/spectral-resolution). Opening up this link leads to a page like that shown in Fig. 1.1, where the LSF files are discussed in detail. 
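For scripted workflows, the LSF files hosted on that page can also be retrieved directly rather than by clicking through it. A minimal sketch follows; the URL pattern and file name below are assumptions for illustration only, so check the COS Spectral Resolution page for the real paths before using them.

import urllib.request

# Assumed location and name of an FUV lifetime-position-3 LSF table --
# verify against the COS Spectral Resolution page before relying on this.
lsf_name = "aa_LSFTable_G130M_1291_LP3_cn.dat"
lsf_url = ("https://www.stsci.edu/files/live/sites/www/files/home/hst/"
           "instrumentation/cos/performance/spectral-resolution/_documents/" + lsf_name)
urllib.request.urlretrieve(lsf_url, lsf_name)  # saves the LSF table locally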
The bottom part of this page has links to all the relavent files. The links at the top of the page will take you to the relevant section. In Fig. 1.1, we have circled in black the link to the section pertaining to our data: FUV at the Lifetime Position: 3.\n", "\n", "###
Fig 1.1: Screenshot of the COS Spectral Resolution Site
\n", - "
\n", + "\n", + "![COS Spectral Resolution Site](figures/LSFHomepage.png \"COS Spectral Resolution Site\")\n", "\n", "Clicking on the circled link takes us to the table of hyperlinks to all the files perataining to data taken with the FUV, Lifetime Postition 3 configutation, shown in Fig. 1.2:\n", "\n", "###
Fig 1.2: Screenshot of the COS Spectral Resolution Site - Focus on LP-POS 3
\n", - "
\n", + "\n", + "\n", + "![COS Spectral Resolution Site - Lifetime Position 3](figures/LSFHomepage2.png \"COS Spectral Resolution Site - Lifetime Position 3\")\n", "\n", "Circled in solid red is the button to download the LSF file we need for our data with CENWAVE = 1291. Circled in dashed black is the corresponding CDSF.\n", "\n", @@ -1341,7 +1344,7 @@ " continuum (array or -1) : if -1, default of continuum of 1, \\\n", " otherwise must be same length as emitspec\n", " \"\"\"\n", - " if type(continuum) == int:\n", + " if type(continuum) is int:\n", " if continuum == -1:\n", " continuum = np.ones(len(emitspec))\n", " \n", @@ -2201,7 +2204,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/COS/Setup/Setup.ipynb b/notebooks/COS/Setup/Setup.ipynb index 92e327992..74f724ff6 100644 --- a/notebooks/COS/Setup/Setup.ipynb +++ b/notebooks/COS/Setup/Setup.ipynb @@ -230,17 +230,22 @@ "First, we will check the CRDS website to determine what the current context is, as it changes regularly. In your browser, navigate to [the HST CRDS homepage](https://hst-crds.stsci.edu), and you will see a page as in Fig. 3.1:\n", "\n", "### Fig 3.1\n", - "
\"The
\n", + "\n", + "\n", + "![The CRDS homepage. There is a dropdown which lists all Hubble instruments. The table at the bottom of the page lists historical contexts.](figures/crds1.png \"CRDS Homepage\")\n", "\n", "At the bottom of this page is a list of recent contexts, titled \"Context History\". Clicking the context listed with the Status \"Operational\" (circled in red in Fig 3.1) will take you to that context's page, as shown in Fig. 3.2:\n", "\n", "### Fig 3.2\n", - "
\"The
\n", + "\n", "\n", - "By clicking the \"cos\" tab, (circled in red), you will be open up the tab, showing a page similar to Fig. 3.3, where you can find the current COS instrument context file: `hst_cos_.imap`. This filename is circled in red in Fig. 3.3.\n", + "![The CRDS Historical References context page on the CRDS site. Each instrument is listed on the dropdown.\"> \n", + "By clicking the \"cos\" tab, (circled in red), you will be open up the tab, showing a page similar to Fig. 3.3, where you can find the current COS instrument context file: `hst_cos_.imap`. This filename is circled in red in Fig. 3.3.](./figures/crds2.png \"CRDS current 'Historical References' context page\")\n", "\n", "### Fig 3.3\n", - "
\"Showing
\n", + "\n", + "\n", + "![Showing the current COS context on the CRDS site. There is a large list of reference files, and the imap file labeled at the top of this list as hst_cos_0320.imap](figures/crds3.png \"Current COS context on the CRDS site\")\n", "\n", "Note down or copy the filename you just found." ] @@ -435,7 +440,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.9" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/COS/ViewData/ViewData.ipynb b/notebooks/COS/ViewData/ViewData.ipynb index cec4cc61e..a7597ac87 100644 --- a/notebooks/COS/ViewData/ViewData.ipynb +++ b/notebooks/COS/ViewData/ViewData.ipynb @@ -458,20 +458,27 @@ "\n", "### Fig. 1.1 from [COS DHB Fig. 1.6](https://hst-docs.stsci.edu/cosdhb/chapter-1-cos-overview/1-2-cos-physical-configuration#id-1.2COSPhysicalConfiguration-Figure1.6)\n", "\n", - "
\n", + "\n", + "\n", + "![Layout of the COS FUV detector. Note that FUVB corresponds to shorter wavelengths than FUVA.](figures/cosdhb_fig1p6.jpg \"Layout of the COS FUV detector\")\n", + "\n", + "Layout of the COS FUV detector. Note that FUVB corresponds to shorter wavelengths than FUVA.\n", "\n", "**In the case of the NUV data, we see a similar `astropy` style table of 3 rows** (labeled NUVA, NUVB, and NUVC). These rows contain data from the 3 stripes of the NUV spectrum (see Figure 1.2).\n", "\n", "### Fig. 1.2 from [COS DHB Fig. 1.10](https://hst-docs.stsci.edu/cosdhb/chapter-1-cos-overview/1-2-cos-physical-configuration#id-1.2COSPhysicalConfiguration-Figure1.10)\n", "\n", - "
\n", + "\n", + "\n", + "![An example COS NUV spectrum. The spectrum itself, taken with the Primary Science Aperture, is in the lower three stripes labeled 'PSA'. The upper stripes, labeled 'WCA' are for wavelength calibration.](figures/ch1_cos_overview3.10.jpg \"An example COS NUV spectrum\")\n", "\n", + "An example COS NUV spectrum. The spectrum itself, taken with the Primary Science Aperture, is in the lower three stripes labeled 'PSA'. The upper stripes, labeled 'WCA' are for wavelength calibration.\n", "\n", "An important thing to note about this *NUV* data in particular is that with the grating used here (G230L), stripe C is actually a 2nd order spectrum with a higher dispersion and ~5% contamination from the 1st order spectrum. See the [COS Data Handbook](https://hst-docs.stsci.edu/cosdhb/chapter-1-cos-overview/1-1-instrument-capabilities-and-design#id-1.1InstrumentCapabilitiesandDesign-NUVSpectroscopyNUVSpectroscopy) for more information." ] @@ -1581,7 +1588,10 @@ "\n", "Caution!\n", "\n", - " \n", + "\n", + "\n", + "![CAUTION!](figures/warning.png \"CAUTION!\")\n", + "\n", "This simplification may not hold very well if your source is diffuse or faint." ] }, @@ -2024,7 +2034,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.9" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb b/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb index 11178bc4d..570803624 100644 --- a/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb +++ b/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb @@ -41,7 +41,9 @@ "\n", "To this end, the functions and examples in this notebook are meant to be an illustrative guide to visualizing possible aperture+companion+orientation configurations. \n", "\n", - "\"Positions\n", + "\n", + "\n", + "![Positions of STIS supported coronagraphic apertures, including the two WEDGEs and two BARs.](c12_special12.1.png \"Positions of STIS supported coronagraphic apertures, including the two WEDGEs and two BARs.\")\n", "\n", "Positions of STIS supported coronagraphic apertures, including the two WEDGEs and two BARs. Note that the full STIS field of view is 50\" x 50\"." ] @@ -494,7 +496,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" } }, "nbformat": 4, diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb index 11eefa55d..bd9b3056b 100644 --- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb +++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb @@ -367,9 +367,11 @@ "The BLEVCORR step is part of basic 2-D image reduction for CCD data only. This step subtracts the electronic bias level for each line of the CCD image and trims the overscan regions off of the input image, leaving only the exposed portions of the image. \n", "\n", "Because the electronic bias level can vary with time and temperature, its value is determined from the overscan region in the particular exposure being processed. This bias is applied equally to real pixels (main detector and physical overscan) and the virtual overscan region (pixels that don't actually exist, but are recorded when the detector clocks out extra times after reading out all the parallel rows). 
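Given the overscan geometry described just below (19 serial overscan columns on each side, 20 virtual rows, and a 1062 x 1044 raw frame), trimming to the exposed science area amounts to a simple array slice. A hedged sketch, assuming the virtual rows sit at the high end of AXIS2 -- the actual placement depends on the readout amplifier:

import numpy as np

raw = np.zeros((1044, 1062))   # stand-in for a raw full frame: (AXIS2 rows, AXIS1 columns)
science = raw[:1024, 19:-19]   # drop 20 virtual rows and 19 overscan columns per side (placement assumed)
assert science.shape == (1024, 1024)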
A raw STIS CCD image in full frame unbinned mode has 19 leading and trailing columns of serial physical overscan in the AXIS1 (x direction), and 20 rows of virtual overscan in the AXIS2 (y direction); therefore the size of the uncalibrated and unbinned full framge CCD image is 1062(serial) $\\times$ 1044(parallel) pixels, with 1024 * 1024 exposed science pixels.\n", - "
\n", + "\n", + "\n", + "![Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.](figures/CCD_overscan.jpg \"Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.\")" ] }, { @@ -380,9 +382,11 @@ "The electronic bias level is subtracted line-by-line. An initial value of electronic bias level is determined for each line of the image using the serial and parallel overscans, and a straight line is fitted to the bias as a function of image line. The intial electronic bias for each line is determined by taking the median of a predetermined subset of the trailing serial overscan pixels, which currently includes most trailing overscan region except the first and last three pixels, and pixels flagged with bad data quality flags. The actual overscan bias subtracted from the image is the value of the linear fit at a specific image line. The mean value of all overscan levels is written to the output SCI extension header as MEANBLEV.\n", "\n", "THE BLEVCORR step also trims the image of overscan. The size of the overscan regions depend on binning and whether the image if full frame or a subimage, and the locations of the overscan regions depend on which amplifier was used for readout. The number of pixels trimmed during CCD bias level correction on each side is given in the following table.\n", - "
\n", + "\n", + "\n", + "![The number of pixels trimmed during CCD bias level correction on each side](figures/pixels_trimmed.jpg)\n" ] }, { @@ -706,7 +710,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" }, "toc": { "base_numbering": "0", diff --git a/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb b/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb index ddcc568be..531952744 100644 --- a/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb +++ b/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb @@ -1206,7 +1206,8 @@ "Next, the images for each detector are combined using `astrodrizzle`. The high-level concept behind drizzling images is described in detail in [Section 3.2 of the DrizzlePac Handbook](https://hst-docs.stsci.edu/drizzpac/files/60245881/109774911/1/1642098151920/DrizzlePac_Handbook_v2.pdf). \n", "\n", "Setting the appropriate `final_scale` and `final_pixfrac` parameters for your images takes some thought and testing to avoid gaps in the data. The figure below shows a basic example of the native pixel scale (red squares), shrink factor `final_pixfrac` (blue squares) and output final pixel scale `final_scale` (grid on right) in a drizzle. For more details on the `astrodrizzle` input parameters, see the the [DrizzlePac code webpages](https://drizzlepac.readthedocs.io/en/latest/astrodrizzle.html).\n", - "\"Drizzle\"\n", + "\n", + "![Schematic representation of how drizzle maps input pixels onto the output image](drizzle.png \"Schematic representation of how drizzle maps input pixels onto the output image\")\n", "\n", "`astrodrizzle` can be used to increase the pixel sampling of data if images have dithering and different position angles (PAs). For the STIS data used here, all images are at the same PA and therefore sub-sampling the data is not possible. The example for the STIS data shown below adopts the native pixel scale of each detector as the `final_scale` and no fractional scaling down of each pixel (`final_pixfrac=1.0`) prior to drizzling. Drizzle in this context is a useful tool for creating mosaics of images aligned with `tweakreg`.\n", "\n", @@ -1306,8 +1307,11 @@ "\n", "The CCD images are all observed at a common position angle and RA/Dec with small dithers (hence poorer quality edges in the mosaic below). Sub-sampling the images with `astrodrizzle` is not advisable for these programs as reducing the pixel size results in gaps in the data.\n", "\n", + "\n", + "\n", "Figure shows the CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right).\n", - "\"CCD" + "![Comparison of the CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right)](ccd_drz.png \"the CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right)\")" ] }, { @@ -1378,9 +1382,13 @@ "\n", "The NUV images are all observed at a common position angle with large dithers (hence different image depths in the mosaic below). 
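For reference, the kind of call the final_scale/final_pixfrac discussion above describes might look like the sketch below, keeping the native pixel scale and no fractional pixel shrinking; the input file pattern and scale value are illustrative assumptions, not values taken from the notebook.

from drizzlepac import astrodrizzle
import glob

astrodrizzle.AstroDrizzle(glob.glob('*_flt.fits'),  # assumed list of aligned input images
                          output='mosaic',
                          final_scale=0.05,         # roughly a native pixel scale; assumed value
                          final_pixfrac=1.0,        # no pixel shrinking before drizzling
                          preserve=False)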
Sub-sampling the images with `astrodrizzle` is not advisable for these programs as reducing the pixel size results in gaps in the data.\n", "\n", - "Figure shows the NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right).\n", + "\n", + "\n", + "Figure shows the NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right).\n", + "![NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right)](nuv_drz.png \"NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right)\")\n", + "\n" ] }, { @@ -1452,7 +1460,8 @@ "The FUV images are all observed at a common position angle with large dithers (hence different image depths in the mosaic below). Sub-sampling the images with `astrodrizzle` is not advisable for these programs as reducing the pixel size results in gaps in the data.\n", "\n", "Figure shows the FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right).\n", - "\"FUV" + "\n", + "![Figure shows the FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right).](fuv_drz.png \"Figure shows the FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right).\")\n" ] }, { @@ -1561,7 +1570,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.18" + "version": "3.8.12" }, "toc": { "base_numbering": 1, diff --git a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb index efc08eb9e..14d358b22 100644 --- a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb +++ b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb @@ -683,7 +683,9 @@ "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", "\n", - "\"An\n" + "\n", + "\n", + "![An animated gif blinking between a subsection of background sky using the v1.0 and V2.0 pixel-based CTE corrections. The v2.0 background appears smoother with less noise and pixel variations.](example/v1_v2_bkg.gif \"An animated gif blinking between a subsection of background sky using the v1.0 and V2.0 pixel-based CTE corrections. The v2.0 background appears smoother with less noise and pixel variations.\")\n" ] }, { @@ -771,7 +773,9 @@ "metadata": {}, "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", - "" + "\n", + "\n", + "![An animated GIF of the v1.0 and v2.0 FLC image subsections](example/v1_v2_subsection.gif \"Animated GIF of the v1.0 and v2.0 FLC image subsections\")" ] }, { @@ -787,7 +791,8 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "" + "\n", + "![Aperture photometry illustration](example/apphot_image.png \"Aperture photometry illustration\")" ] }, { @@ -989,7 +994,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.0" + "version": "3.8.12" }, "varInspector": { "cols": { From 2d4d540261fa37150f77e07d39425720a5d2a33b Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Mon, 4 Dec 2023 11:04:26 -0500 Subject: [PATCH 11/30] Update _config.yml --- _config.yml | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/_config.yml b/_config.yml index 0008a5d0f..8cd1c3728 100644 --- a/_config.yml +++ b/_config.yml @@ -24,6 +24,12 @@ latex: bibtex_bibfiles: - references.bib +parse: + myst_enable_extensions: + # don't forget to list any other extensions you want enabled, + # including those that are enabled by default! See here: https://jupyterbook.org/en/stable/customize/config.html + - html_image + launch_buttons: thebe : true colab : true From e743e3f0694a8fd95f3bc98eb050558bb7948c23 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Mon, 4 Dec 2023 11:30:50 -0500 Subject: [PATCH 12/30] Revert "Updated syntax used to display images so they properly render in generated HTML pages (#149)" This reverts commit 395bb050fb1be579eefd44b4a827c26d7d50c526. --- notebooks/COS/AsnFile/AsnFile.ipynb | 8 +-- notebooks/COS/CalCOS/CalCOS.ipynb | 6 +- notebooks/COS/DataDl/DataDl.ipynb | 69 +++++-------------- notebooks/COS/DayNight/DayNight.ipynb | 6 +- notebooks/COS/LSF/LSF.ipynb | 11 ++- notebooks/COS/Setup/Setup.ipynb | 15 ++-- notebooks/COS/ViewData/ViewData.ipynb | 22 ++---- .../STIS_Coronagraphy_Visualization_v2.ipynb | 6 +- notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 14 ++-- .../STIS_DrizzlePac_Tutorial.ipynb | 21 ++---- .../calwf3_with_v1.0_PCTE.ipynb | 13 ++-- 11 files changed, 57 insertions(+), 134 deletions(-) diff --git a/notebooks/COS/AsnFile/AsnFile.ipynb b/notebooks/COS/AsnFile/AsnFile.ipynb index fbdf4fffa..2c6b04e53 100644 --- a/notebooks/COS/AsnFile/AsnFile.ipynb +++ b/notebooks/COS/AsnFile/AsnFile.ipynb @@ -307,9 +307,7 @@ "\n", "### Figure 1. Comparison of fluxes between the data retrieved from MAST, and the same data reprocessed after removing the bad exposure\n", "\n", - "\n", - "\n", - "![A comparison between the fluxes of our data. One dataset if without our bad exposure removed, and the other is with our bad exposure removed. Both plots are relatively the same, as they should be.](./figures/compare_fluxes_after_removing_badfile.png \"Comparison of fluxes\")" + "A comparison between the fluxes of our data. One dataset if without our bad exposure removed, and the other is with our bad exposure removed. Both plots are relatively the same, as they should be." 
] }, { @@ -907,7 +905,7 @@ " \"Exposure_type\": rawtag_a_exptypes,\n", " # Date in MJD\n", " \"Exposure_start_date\": rawtag_a_expstart_times,\n", - " \"Seconds_since_first_exposure\": \\\n", + " \"Seconds_since_first_exposure\":\\\n", " # Convert time since the first exposure into seconds\n", " 86400*np.subtract(rawtag_a_expstart_times, min(rawtag_a_expstart_times))\n", "})" @@ -1114,7 +1112,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.9.13" }, "nbdime-conflicts": { "local_diff": [ diff --git a/notebooks/COS/CalCOS/CalCOS.ipynb b/notebooks/COS/CalCOS/CalCOS.ipynb index 9ec563ecd..5ffb8dc23 100644 --- a/notebooks/COS/CalCOS/CalCOS.ipynb +++ b/notebooks/COS/CalCOS/CalCOS.ipynb @@ -420,9 +420,7 @@ "\n", "Caution!\n", " \n", - "\n", - "\n", - "![A warning symbol. Watch out!](./figures/warning.png \"CAUTION!\")\n", + "\"A \n", "\n", "*Note* that as of the time of this Notebook's update, the pipeline context used below was **`hst_1071.pmap`**, but this changes over time. You are running this in the future, and there is certainly a newer context you would be better off working with. Take a minute to consider this, and check the [HST Calibration Reference Data System webpage](http://hst-crds.stsci.edu/) to determine what the **current operational pmap file** is. " ] @@ -1098,7 +1096,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.9.13" } }, "nbformat": 4, diff --git a/notebooks/COS/DataDl/DataDl.ipynb b/notebooks/COS/DataDl/DataDl.ipynb index 1afce398c..f4f2455fd 100644 --- a/notebooks/COS/DataDl/DataDl.ipynb +++ b/notebooks/COS/DataDl/DataDl.ipynb @@ -193,9 +193,7 @@ "source": [ "The search page of the HST-specific interface is laid out as in Figure 1.1:\n", "### Fig 1.1\n", - "\n", - "\n", - "![An image of the MAST search site. On the page there are boxes for each different criteria that you want in your search, such as Object Name in the top left, active instruments below that, and at the bottom of the page there is a box that you can fill with a certain column name and your desired criteria (if the criteria is not already on the page). At the top right is the 'My ST' button.](./figures/new_hst_search_login.png \"New HST-specific search website\")\n", + "
\"An
\n", "\n", "If you are accessing proprietary data, you will need to make an account or log in at the top right under \"MY ST\" (Fig 1.1, boxed in red). If you are accessing non-proprietary data, you may continue without logging in.\n" ] @@ -211,9 +209,7 @@ "* Are taken with the COS instrument, using the G160M grating and either the 1533 or the 1577 cenwave setting\n", "\n", "### Fig 1.2\n", - "\n", - "\n", - "![New HST-specific website search filled out with a COS data query, boxes with each criteria from above are highlighted with a red box. The central wavelength condition was added to the bottom box where you choose columns for criteria, since the central wavelength is not a pre-given criteria.](figures/hst_search_query_updated.png \"New HST-specific website search filled out with a COS data query\")" + "
\"\"
" ] }, { @@ -223,9 +219,7 @@ "The above search results in the table shown in Figure 1.3. \n", "\n", "### Fig 1.3\n", - "\n", - "\n", - "![The page is now the search results page. At the top left are the 'Edit Search' box highlighted in dashed red. Below that is the 'Download Dataset' button highlighted with a green circle. Under that is the list of datasets, with each row beginning with a empty checkbox, followed by information about the dataset, such as search position, dataset name, target name, etc.](figures/new_hst_search_results.png \"Results from new HST-specific search website query\")\n", + "
\"The
\n", "\n", "If you need to change some parameters in your search - for instance, to also find data from the G130M grating - click on \"Edit Search\" (Fig 1.3, red dashed box).\n", "\n", @@ -240,9 +234,7 @@ "Most COS spectra have preview images (simple plots of flux by wavelength) which can be viewed before downloading the data. Clicking the dataset name (Fig 1.3, blue dashed oval) will take you to a page which shows the preview image, as well as some basic information about the data and whether there were any known failures during the operation. An example of such a page is shown in Figure 1.4.\n", "\n", "### Fig 1.4\n", - "\n", - "\n", - "![The page is now an image of a spectrum preview for dataset LPXK51020. It shows two plots of wavelength vs flux. Below is a list of exposure information, such as observation data, exposure time, release date, mode. At the very bottom of the page is the proposal ID along with the PI and CoIs.](figures/preview_spectrum_small.png \"Preview spectrum page\")" + "
\"The
" ] }, { @@ -252,15 +244,10 @@ "Returning to the results page shown in Fig 1.3 and clicking \"Download Data\" opens a window as shown in Figure 1.5. In this window you can search for filetypes using the search bar, and unselect/select all the data products shown in the filtered list (Fig 1.5, green circle). Clicking the \"Showing all filetypes\" box (Figure 1.5, red box) shows a drop-down (Figure 1.6) where you can choose to show/hide certain types of data such as the uncalibrated data. \n", "\n", "### Fig 1.5\n", - "\n", - "\n", - "![The image shows the a table of files in dataset LDJ1010. Each row shows different CalCOS products, such as the ASN file, X1DSUM, JIT, etc. The title of the table is the dataset name, and to the left of the name is the button to check all or none exposures, highlighted with a green circle. There are columns on the table that give information about the insturment used (COS) and the filter/grating (G140L). Above the table is a dropdown that allows you to choose the filetypes you wish to download.](figures/new_hst_search_downloading_all_files.png \"Choosing what to download in the new HST search website\")\n", + "
\"The
\n", "\n", "### Fig 1.6\n", - "\n", - "\n", - "![The image shows the same page as Figure 1.5, but the 'Showing all file types' dropdown button has been clicked. There are rows that have checkboxes on the left, each row from top to bottom is labled 'Calibrated', 'Uncalibrated', 'Aux/Other', 'Reference', 'Log/Jitter'](figures/new_hst_search_downloading_filetypes.png \"Choosing which data to download (calibrated/uncalibrated/etc.\")\n", + "
\"The
\n", "\n", "When all of your desired data products are checked, click \"Start Download\" (Fig 1.5, yellow dashed box). This will download a compressed \"zipped\" folder of all of your data, divided into subdirectories by the observation. Most operating systems can decompress these folders by default. For help decompressing the zipped files, you can follow these links for: [Windows](https://support.microsoft.com/en-us/windows/zip-and-unzip-files-8d28fa72-f2f9-712f-67df-f80cf89fd4e5) and [Mac](https://support.apple.com/guide/mac-help/zip-and-unzip-files-and-folders-on-mac-mchlp2528/mac). There are numerous ways to do this on Linux, however we have not vetted them." ] @@ -277,9 +264,7 @@ "\n", "### Fig 1.7\n", "\n", - "\n", - "![The image is showing the search page for MAST again. Halfway down the page is a box to input dataset name criteria. Above that is the obervation type box, which is composed of three check boxes. These boxes from left to right are 'All', 'Science', and 'Calibration'. The science box is the only one that is checked.](figures/new_hst_search_query2_small.png \"New HST-specific website search filled out with a specific dataset ID\")" + "
\"The
" ] }, { @@ -334,7 +319,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -362,9 +347,7 @@ "Because we are searching by Dataset ID, we don't need to specify any additional parameters to narrow down the data.\n", "\n", "### Fig 1.8\n", - "\n", - "\n", - "![The image shows the MAST search page, with the top box filled in. The title of this box is 'Object name(s) and/or RA and Dec pair(s). The filled in box is the text file obsId_list.txt. The observations box below it has only the science box checked off.](figures/new_search_file_list_small.png \"File Upload Search Form\")\n" + "
\"The
\n" ] }, { @@ -374,9 +357,7 @@ "We now can access all the datasets specified in `obsId_list.txt`, as shown in Figure 1.9:\n", "\n", "### Fig 1.9\n", - "\n", - "\n", - "![The image shows the MAST search results page, with the three datasets from our text file shows in each row.](figures/new_search_file_list_res_small.png \"Upload List of Objects Search Results\")\n", + "
\"The
\n", "\n", "We can select and download their data products as before." ] @@ -394,15 +375,11 @@ "\n", "Navigate to the MAST Portal at , and you will be greeted by a screen where the top looks like Figure 1.10. \n", "### Fig 1.10\n", - "\n", - "\n", - "![The image shows the MAST portal, specifically the search box at the top.](figures/mastp_top.png \"Top of MAST Portal Home\")\n", + "
\"The
\n", "\n", "Click on \"Advanced Search\" (boxed in red in Figure 1.10). This will open up a new search tab, as shown in Figure 1.11:\n", "### Fig 1.11\n", - "\n", - "\n", - "![The image shows the advanced search pop-up. On the left of the pop-up is a list of check boxes for different search criteria, such as Target name, instrument, mission, etc. To the left of that are larger boxes for different critera, and inside these mobes are rows with checkboxes. For example, there is a larger box labeled 'Observation type' with two checkbox rows labeled 'science' and 'calibration'.](figures/mastp_adv.png \"The advanced search tab\")\n", + "
\"The
\n", "\n", "Fig 1.11 (above) shows the default search fields which appear. Depending on what you are looking for, these may or may not be the most helpful search fields. By unchecking some of the fields which we are not interested in searching by right now (Figure 1.12, boxed in green), and then entering the parameter values by which to narrow the search into each parameter's box, we generate Fig 1.12. One of the six fields (Mission) by which we are narrowing is boxed in a dashed blue line. The list of applied filters is boxed in red. A dashed pink box at the top left indicates that 2 records were found matching all of these parameters. To its left is an orange box around the \"Search\" button to press to bring up the list of results.\n", "\n", @@ -418,9 +395,7 @@ "|Product Type|spectrum|\n", "\n", "### Fig 1.12\n", - "\n", - "\n", - "![The image shows the same advanced search pop-up, filled out with our desired criteria. The criteria is highlighted, and above it is a list of the applied filters highlighted in red. The top of the pop-up has a 'Search' button, and also lists the records found. There are two records for our search.](figures/mastp_adv_2.png \"The advanced search tab with some selections\")\n", + "
\"The
\n", "\n" ] }, @@ -431,9 +406,7 @@ "Click the \"Search\" button (boxed in orange), and you will be brought to a page resembling Figure 1.13. \n", "\n", "### Fig 1.13\n", - "\n", - "\n", - "![The image shows the search results list, with consists of two rows for our search. To the left of the rows are checkboxes, then there is are images of a disk, spectra, and three dots. The rest of the columns of the rows show some criteria, such as the mission, observation type, etc. To the far right is the image of our object.](figures/mastp_res1.png \"Results of MAST Portal search\")" + "
\"The
" ] }, { @@ -470,9 +443,7 @@ "metadata": {}, "source": [ "### Fig 1.14\n", - "\n", - "\n", - "![The image shows the Download Manager. The left is a list of filters, where you can choose recommended products, product categories, extensions, and groups. To the left is a file list of all files for the observations. There are columns that label the file size, name, product type, etc.](figures/mastp_cart2.png \"MAST Portal Download Basket\")\n", + "
\"The
\n", "\n", "Each dataset contains *many* files, most of which are calibration files or intermediate processing files. You may or may not want some of these intermediate files in addition to the final product file.\n", "In the leftmost \"Filters\" section of the Download Basket page, you can narrow which files will be downloaded (Fig 1.14, boxed in red).\n", @@ -619,9 +590,7 @@ "source": [ "Caution! \n", " \n", - "\n", - "\n", - "![This image is a warning symbol.](figures/warning.png \"CAUTION\")\n", + "\"This \n", "\n", "Please note that these queries are `Astropy` tables and do not always respond as expected for other data structures like `Pandas DataFrames`. For instance, the first way of filtering a table shown below is correct, but the second will consistently produce the *wrong result*. You *must* search and filter these tables by masking them, as in the first example below." ] @@ -676,9 +645,7 @@ "- Support files such as the spacecraft's pointing data over time (`jit` files).\n", "- Intermediate data products such as calibrated TIME-TAG data (`corrtag` or `corrtag_a`/`corrtag_b` files) and extracted 1-dimensional spectra averaged over exposures with a specific `FP-POS` value (`x1dsum` files).\n", "\n", - "\n", - "\n", - "![This image is a warning symbol.](figures/warning.png \"CAUTION\")\n", + "\"This\n", "\n", "However, use caution with downloading all files, as in this case, setting `mrp_only` to `False` results in the transfer of **7 Gigabytes** of data, which can take a long time to transfer and eat away at your computer's storage! In general, only download the files you need. On the other hand, often researchers will download only the raw data, so that they can process it for themselves. Since here we only need the final `x1dsum` and `asn` files, we only need to download 2 Megabytes." ] @@ -927,7 +894,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.10.9" }, "vscode": { "interpreter": { diff --git a/notebooks/COS/DayNight/DayNight.ipynb b/notebooks/COS/DayNight/DayNight.ipynb index 55383cce3..e7c142ef4 100644 --- a/notebooks/COS/DayNight/DayNight.ipynb +++ b/notebooks/COS/DayNight/DayNight.ipynb @@ -524,9 +524,7 @@ "\n", " Caution!\n", "\n", - "\n", - "\n", - "![CAUTION!](figures/warning.png \"CAUTION!\")\n", + " \n", "\n", "**The process in the following two cells can take a long time and strain network resources!** If you have already downloaded *up-to-date* COS reference files, avoid doing so again.\n", " \n", @@ -870,7 +868,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.9.13" } }, "nbformat": 4, diff --git a/notebooks/COS/LSF/LSF.ipynb b/notebooks/COS/LSF/LSF.ipynb index 9e2854ab4..556def5fe 100644 --- a/notebooks/COS/LSF/LSF.ipynb +++ b/notebooks/COS/LSF/LSF.ipynb @@ -298,15 +298,12 @@ "The COS team maintains up-to-date LSF files on the [COS Spectral Resolution page](https://www.stsci.edu/hst/instrumentation/cos/performance/spectral-resolution). Opening up this link leads to a page like that shown in Fig. 1.1, where the LSF files are discussed in detail. The bottom part of this page has links to all the relavent files. The links at the top of the page will take you to the relevant section. In Fig. 1.1, we have circled in black the link to the section pertaining to our data: FUV at the Lifetime Position: 3.\n", "\n", "###
Fig 1.1: Screenshot of the COS Spectral Resolution Site
\n", - "\n", - "![COS Spectral Resolution Site](figures/LSFHomepage.png \"COS Spectral Resolution Site\")\n", + "
\n", "\n", "Clicking on the circled link takes us to the table of hyperlinks to all the files perataining to data taken with the FUV, Lifetime Postition 3 configutation, shown in Fig. 1.2:\n", "\n", "###
Fig 1.2: Screenshot of the COS Spectral Resolution Site - Focus on LP-POS 3
\n", - "\n", - "\n", - "![COS Spectral Resolution Site - Lifetime Position 3](figures/LSFHomepage2.png \"COS Spectral Resolution Site - Lifetime Position 3\")\n", + "
\n", "\n", "Circled in solid red is the button to download the LSF file we need for our data with CENWAVE = 1291. Circled in dashed black is the corresponding CDSF.\n", "\n", @@ -1344,7 +1341,7 @@ " continuum (array or -1) : if -1, default of continuum of 1, \\\n", " otherwise must be same length as emitspec\n", " \"\"\"\n", - " if type(continuum) is int:\n", + " if type(continuum) == int:\n", " if continuum == -1:\n", " continuum = np.ones(len(emitspec))\n", " \n", @@ -2204,7 +2201,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.9.13" } }, "nbformat": 4, diff --git a/notebooks/COS/Setup/Setup.ipynb b/notebooks/COS/Setup/Setup.ipynb index 74f724ff6..92e327992 100644 --- a/notebooks/COS/Setup/Setup.ipynb +++ b/notebooks/COS/Setup/Setup.ipynb @@ -230,22 +230,17 @@ "First, we will check the CRDS website to determine what the current context is, as it changes regularly. In your browser, navigate to [the HST CRDS homepage](https://hst-crds.stsci.edu), and you will see a page as in Fig. 3.1:\n", "\n", "### Fig 3.1\n", - "\n", - "\n", - "![The CRDS homepage. There is a dropdown which lists all Hubble instruments. The table at the bottom of the page lists historical contexts.](figures/crds1.png \"CRDS Homepage\")\n", + "
\"The
\n", "\n", "At the bottom of this page is a list of recent contexts, titled \"Context History\". Clicking the context listed with the Status \"Operational\" (circled in red in Fig 3.1) will take you to that context's page, as shown in Fig. 3.2:\n", "\n", "### Fig 3.2\n", - "\n", + "
\"The
\n", "\n", - "![The CRDS Historical References context page on the CRDS site. Each instrument is listed on the dropdown.\">
\n", - "By clicking the \"cos\" tab, (circled in red), you will be open up the tab, showing a page similar to Fig. 3.3, where you can find the current COS instrument context file: `hst_cos_.imap`. This filename is circled in red in Fig. 3.3.](./figures/crds2.png \"CRDS current 'Historical References' context page\")\n", + "By clicking the \"cos\" tab, (circled in red), you will be open up the tab, showing a page similar to Fig. 3.3, where you can find the current COS instrument context file: `hst_cos_.imap`. This filename is circled in red in Fig. 3.3.\n", "\n", "### Fig 3.3\n", - "\n", - "\n", - "![Showing the current COS context on the CRDS site. There is a large list of reference files, and the imap file labeled at the top of this list as hst_cos_0320.imap](figures/crds3.png \"Current COS context on the CRDS site\")\n", + "
\"Showing
\n", "\n", "Note down or copy the filename you just found." ] @@ -440,7 +435,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.10.9" } }, "nbformat": 4, diff --git a/notebooks/COS/ViewData/ViewData.ipynb b/notebooks/COS/ViewData/ViewData.ipynb index a7597ac87..cec4cc61e 100644 --- a/notebooks/COS/ViewData/ViewData.ipynb +++ b/notebooks/COS/ViewData/ViewData.ipynb @@ -458,27 +458,20 @@ "\n", "### Fig. 1.1 from [COS DHB Fig. 1.6](https://hst-docs.stsci.edu/cosdhb/chapter-1-cos-overview/1-2-cos-physical-configuration#id-1.2COSPhysicalConfiguration-Figure1.6)\n", "\n", - "\n", - "\n", - "![Layout of the COS FUV detector. Note that FUVB corresponds to shorter wavelengths than FUVA.](figures/cosdhb_fig1p6.jpg \"Layout of the COS FUV detector\")\n", - "\n", - "Layout of the COS FUV detector. Note that FUVB corresponds to shorter wavelengths than FUVA.\n", + "
\n", "\n", "**In the case of the NUV data, we see a similar `astropy` style table of 3 rows** (labeled NUVA, NUVB, and NUVC). These rows contain data from the 3 stripes of the NUV spectrum (see Figure 1.2).\n", "\n", "### Fig. 1.2 from [COS DHB Fig. 1.10](https://hst-docs.stsci.edu/cosdhb/chapter-1-cos-overview/1-2-cos-physical-configuration#id-1.2COSPhysicalConfiguration-Figure1.10)\n", "\n", - "\n", - "\n", - "![An example COS NUV spectrum. The spectrum itself, taken with the Primary Science Aperture, is in the lower three stripes labeled 'PSA'. The upper stripes, labeled 'WCA' are for wavelength calibration.](figures/ch1_cos_overview3.10.jpg \"An example COS NUV spectrum\")\n", + "
\n", "\n", - "An example COS NUV spectrum. The spectrum itself, taken with the Primary Science Aperture, is in the lower three stripes labeled 'PSA'. The upper stripes, labeled 'WCA' are for wavelength calibration.\n", "\n", "An important thing to note about this *NUV* data in particular is that with the grating used here (G230L), stripe C is actually a 2nd order spectrum with a higher dispersion and ~5% contamination from the 1st order spectrum. See the [COS Data Handbook](https://hst-docs.stsci.edu/cosdhb/chapter-1-cos-overview/1-1-instrument-capabilities-and-design#id-1.1InstrumentCapabilitiesandDesign-NUVSpectroscopyNUVSpectroscopy) for more information." ] @@ -1588,10 +1581,7 @@ "\n", "Caution!\n", "\n", - "\n", - "\n", - "![CAUTION!](figures/warning.png \"CAUTION!\")\n", - "\n", + " \n", "This simplification may not hold very well if your source is diffuse or faint." ] }, @@ -2034,7 +2024,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.10.9" } }, "nbformat": 4, diff --git a/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb b/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb index 570803624..11178bc4d 100644 --- a/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb +++ b/notebooks/STIS/CoronagraphyViz/STIS_Coronagraphy_Visualization_v2.ipynb @@ -41,9 +41,7 @@ "\n", "To this end, the functions and examples in this notebook are meant to be an illustrative guide to visualizing possible aperture+companion+orientation configurations. \n", "\n", - "\n", - "\n", - "![Positions of STIS supported coronagraphic apertures, including the two WEDGEs and two BARs.](c12_special12.1.png \"Positions of STIS supported coronagraphic apertures, including the two WEDGEs and two BARs.\")\n", + "\"Positions\n", "\n", "Positions of STIS supported coronagraphic apertures, including the two WEDGEs and two BARs. Note that the full STIS field of view is 50\" x 50\"." ] @@ -496,7 +494,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.9.13" } }, "nbformat": 4, diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb index bd9b3056b..11eefa55d 100644 --- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb +++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb @@ -367,11 +367,9 @@ "The BLEVCORR step is part of basic 2-D image reduction for CCD data only. This step subtracts the electronic bias level for each line of the CCD image and trims the overscan regions off of the input image, leaving only the exposed portions of the image. \n", "\n", "Because the electronic bias level can vary with time and temperature, its value is determined from the overscan region in the particular exposure being processed. This bias is applied equally to real pixels (main detector and physical overscan) and the virtual overscan region (pixels that don't actually exist, but are recorded when the detector clocks out extra times after reading out all the parallel rows). 
A raw STIS CCD image in full frame unbinned mode has 19 leading and trailing columns of serial physical overscan in the AXIS1 (x direction), and 20 rows of virtual overscan in the AXIS2 (y direction); therefore the size of the uncalibrated and unbinned full frame CCD image is 1062(serial) $\times$ 1044(parallel) pixels, with 1024 * 1024 exposed science pixels.\n",
-    "\n",
-    "\n",
-    "![Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.](figures/CCD_overscan.jpg \"Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.\")"
+    "<img src=figures/CCD_overscan.jpg alt=\"Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.\">
" ] }, { @@ -382,11 +380,9 @@ "The electronic bias level is subtracted line-by-line. An initial value of electronic bias level is determined for each line of the image using the serial and parallel overscans, and a straight line is fitted to the bias as a function of image line. The intial electronic bias for each line is determined by taking the median of a predetermined subset of the trailing serial overscan pixels, which currently includes most trailing overscan region except the first and last three pixels, and pixels flagged with bad data quality flags. The actual overscan bias subtracted from the image is the value of the linear fit at a specific image line. The mean value of all overscan levels is written to the output SCI extension header as MEANBLEV.\n", "\n", "THE BLEVCORR step also trims the image of overscan. The size of the overscan regions depend on binning and whether the image if full frame or a subimage, and the locations of the overscan regions depend on which amplifier was used for readout. The number of pixels trimmed during CCD bias level correction on each side is given in the following table.\n", - "\n", - "\n", - "![The number of pixels trimmed during CCD bias level correction on each side](figures/pixels_trimmed.jpg)\n" + "" ] }, { @@ -710,7 +706,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.9.13" }, "toc": { "base_numbering": "0", diff --git a/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb b/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb index 531952744..ddcc568be 100644 --- a/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb +++ b/notebooks/STIS/drizpac_notebook/STIS_DrizzlePac_Tutorial.ipynb @@ -1206,8 +1206,7 @@ "Next, the images for each detector are combined using `astrodrizzle`. The high-level concept behind drizzling images is described in detail in [Section 3.2 of the DrizzlePac Handbook](https://hst-docs.stsci.edu/drizzpac/files/60245881/109774911/1/1642098151920/DrizzlePac_Handbook_v2.pdf). \n", "\n", "Setting the appropriate `final_scale` and `final_pixfrac` parameters for your images takes some thought and testing to avoid gaps in the data. The figure below shows a basic example of the native pixel scale (red squares), shrink factor `final_pixfrac` (blue squares) and output final pixel scale `final_scale` (grid on right) in a drizzle. For more details on the `astrodrizzle` input parameters, see the the [DrizzlePac code webpages](https://drizzlepac.readthedocs.io/en/latest/astrodrizzle.html).\n", - "\n", - "![Schematic representation of how drizzle maps input pixels onto the output image](drizzle.png \"Schematic representation of how drizzle maps input pixels onto the output image\")\n", + "\"Drizzle\"\n", "\n", "`astrodrizzle` can be used to increase the pixel sampling of data if images have dithering and different position angles (PAs). For the STIS data used here, all images are at the same PA and therefore sub-sampling the data is not possible. The example for the STIS data shown below adopts the native pixel scale of each detector as the `final_scale` and no fractional scaling down of each pixel (`final_pixfrac=1.0`) prior to drizzling. Drizzle in this context is a useful tool for creating mosaics of images aligned with `tweakreg`.\n", "\n", @@ -1307,11 +1306,8 @@ "\n", "The CCD images are all observed at a common position angle and RA/Dec with small dithers (hence poorer quality edges in the mosaic below). 
Sub-sampling the images with `astrodrizzle` is not advisable for these programs as reducing the pixel size results in gaps in the data.\n",
    "\n",
-    "\n",
-    "\n",
    "Figure shows the CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right).\n",
-    "![Comparison of the CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right)](ccd_drz.png \"the CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right)\")"
+    "<img src=ccd_drz.png alt=\"CCD drizzle of NGC 5139 (left) and the individual CCD reference image used for alignment (right)\">"
   ]
  },
@@ -1382,13 +1378,9 @@
    "\n",
    "The NUV images are all observed at a common position angle with large dithers (hence different image depths in the mosaic below). Sub-sampling the images with `astrodrizzle` is not advisable for these programs as reducing the pixel size results in gaps in the data.\n",
-    "\n",
-    "\n",
    "Figure shows the NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right).\n",
-    "![NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right)](nuv_drz.png \"NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right)\")\n",
-    "\n"
+    "\n",
+    "<img src=nuv_drz.png alt=\"NUV drizzle of NGC 6681 (left) and the individual NUV reference image used for alignment (right)\">"
   ]
  },
@@ -1460,8 +1452,7 @@
    "The FUV images are all observed at a common position angle with large dithers (hence different image depths in the mosaic below). Sub-sampling the images with `astrodrizzle` is not advisable for these programs as reducing the pixel size results in gaps in the data.\n",
    "\n",
    "Figure shows the FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right).\n",
-    "\n",
-    "![Figure shows the FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right).](fuv_drz.png \"Figure shows the FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right).\")\n"
+    "<img src=fuv_drz.png alt=\"FUV drizzle of NGC 6681 (left) and the individual FUV reference image used for alignment (right)\">"
   ]
  },
@@ -1570,7 +1561,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-   "version": "3.8.12"
+   "version": "3.9.18"
  },
  "toc": {
   "base_numbering": 1,
diff --git a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb
index 14d358b22..efc08eb9e 100644
--- a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb
+++ b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb
@@ -683,9 +683,7 @@
   "source": [
    "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", "\n", - "\n", - "\n", - "![An animated gif blinking between a subsection of background sky using the v1.0 and V2.0 pixel-based CTE corrections. The v2.0 background appears smoother with less noise and pixel variations.](example/v1_v2_bkg.gif \"An animated gif blinking between a subsection of background sky using the v1.0 and V2.0 pixel-based CTE corrections. The v2.0 background appears smoother with less noise and pixel variations.\")\n" + "\"An\n" ] }, { @@ -773,9 +771,7 @@ "metadata": {}, "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", - "\n", - "\n", - "![An animated GIF of the v1.0 and v2.0 FLC image subsections](example/v1_v2_subsection.gif \"Animated GIF of the v1.0 and v2.0 FLC image subsections\")" + "" ] }, { @@ -791,8 +787,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n", - "![Aperture photometry illustration](example/apphot_image.png \"Aperture photometry illustration\")" + "" ] }, { @@ -994,7 +989,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.11.0" }, "varInspector": { "cols": { From e464475d87aae3dec02835622be33f5914ed45d4 Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 4 Dec 2023 13:02:48 -0500 Subject: [PATCH 13/30] calstis_2d_ccd.ipynb: commented out second image display text and removed centering --- notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 8 +++++--- 1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb index 11eefa55d..37a533a71 100644 --- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb +++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb @@ -367,9 +367,11 @@ "The BLEVCORR step is part of basic 2-D image reduction for CCD data only. This step subtracts the electronic bias level for each line of the CCD image and trims the overscan regions off of the input image, leaving only the exposed portions of the image. \n", "\n", "Because the electronic bias level can vary with time and temperature, its value is determined from the overscan region in the particular exposure being processed. This bias is applied equally to real pixels (main detector and physical overscan) and the virtual overscan region (pixels that don't actually exist, but are recorded when the detector clocks out extra times after reading out all the parallel rows). A raw STIS CCD image in full frame unbinned mode has 19 leading and trailing columns of serial physical overscan in the AXIS1 (x direction), and 20 rows of virtual overscan in the AXIS2 (y direction); therefore the size of the uncalibrated and unbinned full framge CCD image is 1062(serial) $\\times$ 1044(parallel) pixels, with 1024 * 1024 exposed science pixels.\n", - "
\n", + "\n", + "\n", + "\"Graph" ] }, { @@ -706,7 +708,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.8.12" }, "toc": { "base_numbering": "0", From 11c14aefa854415d85eb46408cf850d584f431ab Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 4 Dec 2023 13:14:35 -0500 Subject: [PATCH 14/30] calstis_2d_ccd.ipynb: trying another way --- notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb index 37a533a71..650fc891d 100644 --- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb +++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb @@ -371,7 +371,9 @@ " \"Graph\n", "
-->\n", "\n", - "\"Graph" + "\"Graph\n", + "\n", + "![Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.](figures/CCD_overscan.jpg)" ] }, { From 8f1edf52c3ffe4175cf1dda3dd497059283bc36c Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 4 Dec 2023 13:45:27 -0500 Subject: [PATCH 15/30] calstis_2d_ccd.ipynb: trying another way --- notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 15 ++++++++++++--- 1 file changed, 12 insertions(+), 3 deletions(-) diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb index 650fc891d..3f48fa437 100644 --- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb +++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb @@ -376,6 +376,16 @@ "![Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.](figures/CCD_overscan.jpg)" ] }, + { + "cell_type": "markdown", + "id": "0f49f518-dac7-4587-b878-8d3b83f764f0", + "metadata": { + "tags": [] + }, + "source": [ + "\"Graph" + ] + }, { "cell_type": "markdown", "id": "03bcf323", @@ -384,9 +394,8 @@ "The electronic bias level is subtracted line-by-line. An initial value of electronic bias level is determined for each line of the image using the serial and parallel overscans, and a straight line is fitted to the bias as a function of image line. The intial electronic bias for each line is determined by taking the median of a predetermined subset of the trailing serial overscan pixels, which currently includes most trailing overscan region except the first and last three pixels, and pixels flagged with bad data quality flags. The actual overscan bias subtracted from the image is the value of the linear fit at a specific image line. The mean value of all overscan levels is written to the output SCI extension header as MEANBLEV.\n", "\n", "THE BLEVCORR step also trims the image of overscan. The size of the overscan regions depend on binning and whether the image if full frame or a subimage, and the locations of the overscan regions depend on which amplifier was used for readout. The number of pixels trimmed during CCD bias level correction on each side is given in the following table.\n", - "
\n", - " \n", - "
" + "\n", + "" ] }, { From 54591352125ac1650fbd5deb640a8dca36a2bcc3 Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 4 Dec 2023 14:23:47 -0500 Subject: [PATCH 16/30] calstis_2d_ccd.ipynb: reverted back to origional --- notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 23 ++++++--------------- 1 file changed, 6 insertions(+), 17 deletions(-) diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb index 3f48fa437..7c0083294 100644 --- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb +++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb @@ -367,23 +367,10 @@ "The BLEVCORR step is part of basic 2-D image reduction for CCD data only. This step subtracts the electronic bias level for each line of the CCD image and trims the overscan regions off of the input image, leaving only the exposed portions of the image. \n", "\n", "Because the electronic bias level can vary with time and temperature, its value is determined from the overscan region in the particular exposure being processed. This bias is applied equally to real pixels (main detector and physical overscan) and the virtual overscan region (pixels that don't actually exist, but are recorded when the detector clocks out extra times after reading out all the parallel rows). A raw STIS CCD image in full frame unbinned mode has 19 leading and trailing columns of serial physical overscan in the AXIS1 (x direction), and 20 rows of virtual overscan in the AXIS2 (y direction); therefore the size of the uncalibrated and unbinned full framge CCD image is 1062(serial) $\\times$ 1044(parallel) pixels, with 1024 * 1024 exposed science pixels.\n", - "\n", - "\n", - "\"Graph\n", "\n", - "![Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.](figures/CCD_overscan.jpg)" - ] - }, - { - "cell_type": "markdown", - "id": "0f49f518-dac7-4587-b878-8d3b83f764f0", - "metadata": { - "tags": [] - }, - "source": [ - "\"Graph" + "
\n", + " \"Graph\n", + "
" ] }, { @@ -395,7 +382,9 @@ "\n", "THE BLEVCORR step also trims the image of overscan. The size of the overscan regions depend on binning and whether the image if full frame or a subimage, and the locations of the overscan regions depend on which amplifier was used for readout. The number of pixels trimmed during CCD bias level correction on each side is given in the following table.\n", "\n", - "" + "
\n", + " \n", + "
" ] }, { From b2db9828ccfb8778217bfc688d47a0b3a916cdf6 Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 4 Dec 2023 14:26:18 -0500 Subject: [PATCH 17/30] DataDl.ipynb: removed centering from first image --- notebooks/COS/DataDl/DataDl.ipynb | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/notebooks/COS/DataDl/DataDl.ipynb b/notebooks/COS/DataDl/DataDl.ipynb index f4f2455fd..7dd57ae41 100644 --- a/notebooks/COS/DataDl/DataDl.ipynb +++ b/notebooks/COS/DataDl/DataDl.ipynb @@ -193,9 +193,9 @@ "source": [ "The search page of the HST-specific interface is laid out as in Figure 1.1:\n", "### Fig 1.1\n", - "
\"An
\n", + "\"An\n", "\n", - "If you are accessing proprietary data, you will need to make an account or log in at the top right under \"MY ST\" (Fig 1.1, boxed in red). If you are accessing non-proprietary data, you may continue without logging in.\n" + "If you are accessing proprietary data, you will need to make an account or log in at the top right under \"MY ST\" (Fig 1.1, boxed in red). If you are accessing non-proprietary data, you may continue without logging in..\n" ] }, { @@ -894,7 +894,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.9" + "version": "3.8.12" }, "vscode": { "interpreter": { From 32a780f3f4a884c824ef7c1cd169ea628aa4addf Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Tue, 5 Dec 2023 10:06:13 -0500 Subject: [PATCH 18/30] Add WFC3 notebook 'WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb' (#112) * updated _toc.yml and _config.yml files * WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb: cleared notebook outputs. * updating requirements.txt, some edits to notebook * pep8 corrections and adding a section to the notebook to automatically downlaod necessary files from the WFC3 webpage * PEP8 corrections --------- Co-authored-by: annierose3 --- _config.yml | 1 - _toc.yml | 2 +- ...l_Area_Map_Corrections_for_Subarrays.ipynb | 100 +++++++----------- .../uvis_pam_corrections/requirements.txt | 2 +- 4 files changed, 38 insertions(+), 67 deletions(-) diff --git a/_config.yml b/_config.yml index 8cd1c3728..894344bab 100644 --- a/_config.yml +++ b/_config.yml @@ -59,7 +59,6 @@ exclude_patterns: [notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb, notebooks/WFC3/ir_scattered_light_manual_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_by_Manually_Subtracting_Bad_Reads.ipynb, notebooks/WFC3/photometry_examples/phot_examples.ipynb, notebooks/WFC3/tvb_flattenramp/TVB_flattenramp_notebook.ipynb, - notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb, notebooks/WFC3/uvis_time_dependent_photometry/uvis_timedep_phot.ipynb, notebooks/WFC3/zeropoints/zeropoints.ipynb] diff --git a/_toc.yml b/_toc.yml index 8e6eb2729..f5cbbd213 100644 --- a/_toc.yml +++ b/_toc.yml @@ -72,6 +72,6 @@ parts: - file: notebooks/WFC3/persistence/wfc3_ir_persistence.ipynb # - file: notebooks/WFC3/photometry_examples/phot_examples.ipynb # - file: notebooks/WFC3/tvb_flattenramp/TVB_flattenramp_notebook.ipynb -# - file: notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb + - file: notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb # - file: notebooks/WFC3/uvis_time_dependent_photometry/uvis_timedep_phot.ipynb # - file: notebooks/WFC3/zeropoints/zeropoints.ipynb diff --git a/notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb b/notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb index e15349d31..4be33422f 100644 --- a/notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb +++ b/notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.ipynb @@ -86,7 +86,7 @@ "- *astroquery* for downloading data from MAST\n", "- *matplotlib.pyplot* for plotting data\n", "- *ginga* for finding min/max outlier pixels\n", - "- *shutil* for copying files from one directory to another" + "- *urlib* for downloading files from a webpage" ] }, { @@ -100,11 +100,10 @@ "outputs": [], 
"source": [ "%matplotlib inline\n", - "import shutil\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", + "import urllib.request\n", "from astropy.io import fits\n", - "from astroquery.mast import Mast\n", "from astroquery.mast import Observations\n", "from ginga.util.zscale import zscale" ] @@ -305,12 +304,12 @@ "source": [ "data_512 = hdu_512[1].data\n", "\n", - "#We use zcale to find the min and max for plotting\n", + "# We use zcale to find the min and max for plotting\n", "vmin_512, vmax_512 = zscale(data_512)\n", "\n", "im = plt.imshow(data_512, vmin=vmin_512, vmax=vmax_512, origin='lower')\n", "clb = plt.colorbar(im)\n", - "_= clb.ax.set_title('Electrons')" + "_ = clb.ax.set_title('Electrons')" ] }, { @@ -369,7 +368,7 @@ "x1 = int(x0 + scihdr_512['NAXIS1'])\n", "y1 = int(y0 + scihdr_512['NAXIS2'])\n", "\n", - "print (f'(x0, y0, x1, y1) = ({x0}, {y0}, {x1}, {y1})')" + "print(f'(x0, y0, x1, y1) = ({x0}, {y0}, {x1}, {y1})')" ] }, { @@ -390,16 +389,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The UVIS CCD has two chips: UVIS1 and UVIS2.\n", - "\n", - "First, please go to the [WFC3 PAM website](https://www.stsci.edu/hst/instrumentation/wfc3/data-analysis/pixel-area-maps) and download the UVIS1 and UVIS2 PAMs under the \"Download Pixel Area Maps\" header. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now, we will use `shutil.copy` to copy the PAMs from your Downloads directory to the present working directory. " + "The UVIS CCD has two chips: UVIS1 and UVIS2. We will download the UVIS1 and UVIS2 PAMs from the [WFC3 PAM website](https://www.stsci.edu/hst/instrumentation/wfc3/data-analysis/pixel-area-maps)." ] }, { @@ -408,35 +398,27 @@ "metadata": {}, "outputs": [], "source": [ - "#Please add the path to your local directory in place of \"pwd\" below\n", - "#Please add the path to your Downloads folder in place of \"downloads\" below\n", - "downloads = '/Users/yourHomeDirectory/Downloads/'\n", - "pwd = '/Users/yourHomeDirectory/PAM_notebook/'" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Copy the content of source to destination\n", - "source_1 = downloads + 'UVIS1wfc3_map.fits'\n", - "source_2 = downloads + 'UVIS2wfc3_map.fits'\n", + "filenames = [\"UVIS1wfc3_map.fits\", \"UVIS2wfc3_map.fits\"]\n", + "\n", + "try:\n", + " for filename in filenames:\n", + " url = (\"https://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/\" + \n", + " \"wfc3/data-analysis/pixel-area-maps/_documents/\")\n", + " url += filename\n", "\n", - "dest_path1 = shutil.copy(source_1, pwd) \n", - "dest_path2 = shutil.copy(source_2, pwd) \n", + " with urllib.request.urlopen(url) as response:\n", + " with open(filename, \"wb\") as out_file:\n", + " out_file.write(response.read())\n", "\n", - "# Print path of newly created file\n", - "print(\"Destination path for the UVIS1 PAM:\", dest_path1)\n", - "print(\"Destination path for the UVIS2 PAM:\", dest_path2)" + "except Exception as e:\n", + " print(f\"An error occured: {e}\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now we have our WFC3/UVIS PAM files in our working directory." + "Now, we should have our WFC3/UVIS PAM files in our working directory." 
] }, { @@ -463,14 +445,14 @@ "outputs": [], "source": [ "if scihdr_512['CCDCHIP'] == 1:\n", - " pam = fits.getdata('UVIS1wfc3_map.fits')\n", - " pamcorr_data_512 = data_512 * pam[y0:y1, x0:x1]\n", + " pam = fits.getdata('UVIS1wfc3_map.fits')\n", + " pamcorr_data_512 = data_512 * pam[y0:y1, x0:x1]\n", "\n", "elif scihdr_512['CCDCHIP'] == 2:\n", - " pam = fits.getdata('UVIS2wfc3_map.fits')\n", - " pamcorr_data_512 = data_512 * pam[y0:y1, x0:x1]\n", + " pam = fits.getdata('UVIS2wfc3_map.fits')\n", + " pamcorr_data_512 = data_512 * pam[y0:y1, x0:x1]\n", "else:\n", - " raise Exception('Chip case not handled.')\n" + " raise Exception('Chip case not handled.')" ] }, { @@ -488,14 +470,13 @@ "source": [ "diff_data_512 = (pamcorr_data_512 - data_512)\n", "\n", - "\n", - "#We use zcale to find the min and max for plotting\n", + "# We use zcale to find the min and max for plotting\n", "vmin_diff_512, vmax_diff_512 = zscale(diff_data_512)\n", "\n", "im = plt.imshow(diff_data_512, vmin=vmin_diff_512, vmax=vmax_diff_512, origin='lower')\n", "\n", "clb = plt.colorbar(im)\n", - "_= clb.ax.set_title('Electrons')" + "_ = clb.ax.set_title('Electrons')" ] }, { @@ -536,14 +517,12 @@ " pamcorr_data : array\n", " PAM-corrected data.\n", " \"\"\"\n", - "\n", " data = np.copy(data)\n", " x0 = int(np.abs(scihdr['LTV1']))\n", " y0 = int(np.abs(scihdr['LTV2']))\n", " x1 = int(x0 + scihdr['NAXIS1'])\n", " y1 = int(y0 + scihdr['NAXIS2'])\n", " \n", - "\n", " if scihdr['CCDCHIP'] == 1:\n", " pam = fits.getdata(pamdir + 'UVIS1wfc3_map.fits')\n", " pamcorr_data = data * pam[y0:y1, x0:x1]\n", @@ -582,12 +561,12 @@ "metadata": {}, "outputs": [], "source": [ - "#We use zcale to find the min and max for plotting\n", + "# We use zcale to find the min and max for plotting\n", "vmin_1024, vmax_1024 = zscale(data_1024)\n", "\n", "im = plt.imshow(data_1024, vmin=vmin_1024, vmax=vmax_1024, origin='lower')\n", "clb = plt.colorbar(im)\n", - "_= clb.ax.set_title('Electrons')" + "_ = clb.ax.set_title('Electrons')" ] }, { @@ -607,13 +586,13 @@ "source": [ "diff_data_1024 = (pamcorr_data_1024-data_1024)\n", "\n", - "#We use zcale to find the min and max for plotting\n", + "# We use zcale to find the min and max for plotting\n", "vmin_diff_1024, vmax_diff_1024 = zscale(diff_data_1024)\n", "\n", "im = plt.imshow(diff_data_1024, vmin=vmin_diff_1024, vmax=vmax_diff_1024, origin='lower')\n", "\n", "clb = plt.colorbar(im)\n", - "_= clb.ax.set_title('Electrons')" + "_ = clb.ax.set_title('Electrons')" ] }, { @@ -641,12 +620,12 @@ "metadata": {}, "outputs": [], "source": [ - "#We use zcale to find the min and max for plotting\n", + "# We use zcale to find the min and max for plotting\n", "vmin_2048, vmax_2048 = zscale(data_2048)\n", "\n", "im = plt.imshow(data_2048, vmin=vmin_2048, vmax=vmax_2048, origin='lower')\n", "clb = plt.colorbar(im)\n", - "_= clb.ax.set_title('Electrons')" + "_ = clb.ax.set_title('Electrons')" ] }, { @@ -666,13 +645,13 @@ "source": [ "diff_data_2048 = (pamcorr_data_2048-data_2048)\n", "\n", - "#We use zcale to find the min and max for plotting\n", + "# We use zcale to find the min and max for plotting\n", "vmin_diff_2048, vmax_diff_2048 = zscale(diff_data_2048)\n", "\n", "im = plt.imshow(diff_data_2048, vmin=vmin_diff_2048, vmax=vmax_diff_2048, origin='lower')\n", "\n", "clb = plt.colorbar(im)\n", - "_= clb.ax.set_title('Electrons')" + "_ = clb.ax.set_title('Electrons')" ] }, { @@ -738,13 +717,6 @@ "[Top of Page](#top)\n", "\"Space " ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - 
"outputs": [], - "source": [] } ], "metadata": { @@ -763,9 +735,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.0" + "version": "3.8.12" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/notebooks/WFC3/uvis_pam_corrections/requirements.txt b/notebooks/WFC3/uvis_pam_corrections/requirements.txt index 6de0ed093..ee67db939 100644 --- a/notebooks/WFC3/uvis_pam_corrections/requirements.txt +++ b/notebooks/WFC3/uvis_pam_corrections/requirements.txt @@ -1,5 +1,5 @@ astropy==5.2.1 astroquery==0.4.6 -ginga==4.1.1 +ginga==4.0.1 matplotlib==3.7.0 numpy==1.23.4 From aba55b0a0988ae24f2aa4f7ef3eb58a0100a59d8 Mon Sep 17 00:00:00 2001 From: srosagomez <113535721+srosagomez@users.noreply.github.com> Date: Tue, 5 Dec 2023 15:21:53 -0500 Subject: [PATCH 19/30] edit filepath to get rid of error (#153) * edit filepath to get rid of error * update requirements.txt * remove output from notebook run --- notebooks/COS/DayNight/DayNight.ipynb | 8 ++++---- notebooks/COS/DayNight/requirements.txt | 1 + 2 files changed, 5 insertions(+), 4 deletions(-) diff --git a/notebooks/COS/DayNight/DayNight.ipynb b/notebooks/COS/DayNight/DayNight.ipynb index e7c142ef4..4971ea600 100644 --- a/notebooks/COS/DayNight/DayNight.ipynb +++ b/notebooks/COS/DayNight/DayNight.ipynb @@ -550,7 +550,7 @@ "\n", "# The next line depends on your context and pmap file\n", "# You can find the latest pmap file at https://hst-crds.stsci.edu\n", - "!crds bestrefs --files **/*corrtag*.fits --sync-references=2 --update-bestrefs --new-context 'hst_1078.pmap'" + "!crds bestrefs --files **/*corrtag*.fits --sync-references=2 --update-bestrefs --new-context 'hst_1123.pmap'" ] }, { @@ -631,7 +631,7 @@ "outputs": [], "source": [ "# Read in the wvln, flux, flux error of the *UNfiltered* spectrum file\n", - "unfilt_tab = Table.read(\"./output/full_data_outs/\"+filename_root+\"_x1dsum.fits\")['WAVELENGTH', 'FLUX', 'ERROR']\n", + "unfilt_tab = Table.read(f\"./output/full_data_outs/{filename_root}_x1dsum.fits\")['WAVELENGTH', 'FLUX', 'ERROR']\n", "\n", "# Read in the wvln, flux, flux error of the *filtered* spectrum file\n", "filt_tab = Table.read(\"./output/filtered_data_outs/filtered_x1dsum.fits\")['WAVELENGTH', 'FLUX', 'ERROR']\n", @@ -820,7 +820,7 @@ "\n", "**Contributors:** Elaine Mae Frazer\n", "\n", - "**Updated On:** 2023-06-26\n", + "**Updated On:** 2023-12-05\n", "\n", "> *This tutorial was generated to be in compliance with the [STScI style guides](https://github.com/spacetelescope/style-guides) and would like to cite the [Jupyter guide](https://github.com/spacetelescope/style-guides/blob/master/templates/example_notebook.ipynb) in particular.*\n", "\n", @@ -868,7 +868,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.11.0" } }, "nbformat": 4, diff --git a/notebooks/COS/DayNight/requirements.txt b/notebooks/COS/DayNight/requirements.txt index 046dd15fd..c4934a223 100644 --- a/notebooks/COS/DayNight/requirements.txt +++ b/notebooks/COS/DayNight/requirements.txt @@ -4,3 +4,4 @@ calcos==3.4.4 costools==1.2.6 matplotlib==3.7.0 numpy==1.23.4 +crds==11.17.13 \ No newline at end of file From 50f82099032add5a22ddbde1075bfbec7d25908f Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 11 Dec 2023 10:55:09 -0500 Subject: [PATCH 20/30] DataDl.ipynb: removed centering from the rest of the images --- notebooks/COS/DataDl/DataDl.ipynb | 24 ++++++++++++------------ 1 file changed, 12 insertions(+), 12 deletions(-) 
diff --git a/notebooks/COS/DataDl/DataDl.ipynb b/notebooks/COS/DataDl/DataDl.ipynb index 7dd57ae41..28642bef9 100644 --- a/notebooks/COS/DataDl/DataDl.ipynb +++ b/notebooks/COS/DataDl/DataDl.ipynb @@ -209,7 +209,7 @@ "* Are taken with the COS instrument, using the G160M grating and either the 1533 or the 1577 cenwave setting\n", "\n", "### Fig 1.2\n", - "
\"\"
" + "\"New" ] }, { @@ -219,7 +219,7 @@ "The above search results in the table shown in Figure 1.3. \n", "\n", "### Fig 1.3\n", - "
\"The
\n", + "\"The\n", "\n", "If you need to change some parameters in your search - for instance, to also find data from the G130M grating - click on \"Edit Search\" (Fig 1.3, red dashed box).\n", "\n", @@ -234,7 +234,7 @@ "Most COS spectra have preview images (simple plots of flux by wavelength) which can be viewed before downloading the data. Clicking the dataset name (Fig 1.3, blue dashed oval) will take you to a page which shows the preview image, as well as some basic information about the data and whether there were any known failures during the operation. An example of such a page is shown in Figure 1.4.\n", "\n", "### Fig 1.4\n", - "
\"The
" + "\"The" ] }, { @@ -244,10 +244,10 @@ "Returning to the results page shown in Fig 1.3 and clicking \"Download Data\" opens a window as shown in Figure 1.5. In this window you can search for filetypes using the search bar, and unselect/select all the data products shown in the filtered list (Fig 1.5, green circle). Clicking the \"Showing all filetypes\" box (Figure 1.5, red box) shows a drop-down (Figure 1.6) where you can choose to show/hide certain types of data such as the uncalibrated data. \n", "\n", "### Fig 1.5\n", - "
\"The
\n", + "\"The\n", "\n", "### Fig 1.6\n", - "
\"The
\n", + "\"The\n", "\n", "When all of your desired data products are checked, click \"Start Download\" (Fig 1.5, yellow dashed box). This will download a compressed \"zipped\" folder of all of your data, divided into subdirectories by the observation. Most operating systems can decompress these folders by default. For help decompressing the zipped files, you can follow these links for: [Windows](https://support.microsoft.com/en-us/windows/zip-and-unzip-files-8d28fa72-f2f9-712f-67df-f80cf89fd4e5) and [Mac](https://support.apple.com/guide/mac-help/zip-and-unzip-files-and-folders-on-mac-mchlp2528/mac). There are numerous ways to do this on Linux, however we have not vetted them." ] @@ -264,7 +264,7 @@ "\n", "### Fig 1.7\n", "\n", - "
\"The
" + "\"The" ] }, { @@ -347,7 +347,7 @@ "Because we are searching by Dataset ID, we don't need to specify any additional parameters to narrow down the data.\n", "\n", "### Fig 1.8\n", - "
\"The
\n" + "\"The\n" ] }, { @@ -357,7 +357,7 @@ "We now can access all the datasets specified in `obsId_list.txt`, as shown in Figure 1.9:\n", "\n", "### Fig 1.9\n", - "
\"The
\n", + "\"The\n", "\n", "We can select and download their data products as before." ] @@ -375,11 +375,11 @@ "\n", "Navigate to the MAST Portal at , and you will be greeted by a screen where the top looks like Figure 1.10. \n", "### Fig 1.10\n", - "
\"The
\n", + "\"The\n", "\n", "Click on \"Advanced Search\" (boxed in red in Figure 1.10). This will open up a new search tab, as shown in Figure 1.11:\n", "### Fig 1.11\n", - "
\"The
\n", + "\"The\n", "\n", "Fig 1.11 (above) shows the default search fields which appear. Depending on what you are looking for, these may or may not be the most helpful search fields. By unchecking some of the fields which we are not interested in searching by right now (Figure 1.12, boxed in green), and then entering the parameter values by which to narrow the search into each parameter's box, we generate Fig 1.12. One of the six fields (Mission) by which we are narrowing is boxed in a dashed blue line. The list of applied filters is boxed in red. A dashed pink box at the top left indicates that 2 records were found matching all of these parameters. To its left is an orange box around the \"Search\" button to press to bring up the list of results.\n", "\n", @@ -395,7 +395,7 @@ "|Product Type|spectrum|\n", "\n", "### Fig 1.12\n", - "
\"The
\n", + "\"The\n", "\n" ] }, @@ -443,7 +443,7 @@ "metadata": {}, "source": [ "### Fig 1.14\n", - "
\"The
\n", + "\"The\n", "\n", "Each dataset contains *many* files, most of which are calibration files or intermediate processing files. You may or may not want some of these intermediate files in addition to the final product file.\n", "In the leftmost \"Filters\" section of the Download Basket page, you can narrow which files will be downloaded (Fig 1.14, boxed in red).\n", From 6a1da4b697cb1110be8e23757ab8b2b32e91712e Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 11 Dec 2023 11:41:42 -0500 Subject: [PATCH 21/30] COS/Setup.ipynb: fixed three images. --- notebooks/COS/Setup/Setup.ipynb | 13 +++---------- 1 file changed, 3 insertions(+), 10 deletions(-) diff --git a/notebooks/COS/Setup/Setup.ipynb b/notebooks/COS/Setup/Setup.ipynb index 74f724ff6..a4eddbb72 100644 --- a/notebooks/COS/Setup/Setup.ipynb +++ b/notebooks/COS/Setup/Setup.ipynb @@ -230,22 +230,15 @@ "First, we will check the CRDS website to determine what the current context is, as it changes regularly. In your browser, navigate to [the HST CRDS homepage](https://hst-crds.stsci.edu), and you will see a page as in Fig. 3.1:\n", "\n", "### Fig 3.1\n", - "\n", - "\n", - "![The CRDS homepage. There is a dropdown which lists all Hubble instruments. The table at the bottom of the page lists historical contexts.](figures/crds1.png \"CRDS Homepage\")\n", + "\"The\n", "\n", "At the bottom of this page is a list of recent contexts, titled \"Context History\". Clicking the context listed with the Status \"Operational\" (circled in red in Fig 3.1) will take you to that context's page, as shown in Fig. 3.2:\n", "\n", "### Fig 3.2\n", - "\n", - "\n", - "![The CRDS Historical References context page on the CRDS site. Each instrument is listed on the dropdown.\"> \n", - "By clicking the \"cos\" tab, (circled in red), you will be open up the tab, showing a page similar to Fig. 3.3, where you can find the current COS instrument context file: `hst_cos_.imap`. This filename is circled in red in Fig. 3.3.](./figures/crds2.png \"CRDS current 'Historical References' context page\")\n", + "\"The\n", "\n", "### Fig 3.3\n", - "\n", - "\n", - "![Showing the current COS context on the CRDS site. There is a large list of reference files, and the imap file labeled at the top of this list as hst_cos_0320.imap](figures/crds3.png \"Current COS context on the CRDS site\")\n", + "\"Showing\n", "\n", "Note down or copy the filename you just found." ] From ed90924a513525d1be1e558114c000f6842a4ea3 Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 11 Dec 2023 11:46:42 -0500 Subject: [PATCH 22/30] COS/LSF.ipynb: fixed two images. --- notebooks/COS/LSF/LSF.ipynb | 7 ++----- 1 file changed, 2 insertions(+), 5 deletions(-) diff --git a/notebooks/COS/LSF/LSF.ipynb b/notebooks/COS/LSF/LSF.ipynb index 9e2854ab4..396fb98a0 100644 --- a/notebooks/COS/LSF/LSF.ipynb +++ b/notebooks/COS/LSF/LSF.ipynb @@ -298,15 +298,12 @@ "The COS team maintains up-to-date LSF files on the [COS Spectral Resolution page](https://www.stsci.edu/hst/instrumentation/cos/performance/spectral-resolution). Opening up this link leads to a page like that shown in Fig. 1.1, where the LSF files are discussed in detail. The bottom part of this page has links to all the relavent files. The links at the top of the page will take you to the relevant section. In Fig. 1.1, we have circled in black the link to the section pertaining to our data: FUV at the Lifetime Position: 3.\n", "\n", "###
Fig 1.1: Screenshot of the COS Spectral Resolution Site
\n", - "\n", - "![COS Spectral Resolution Site](figures/LSFHomepage.png \"COS Spectral Resolution Site\")\n", + "\"COS\n", "\n", "Clicking on the circled link takes us to the table of hyperlinks to all the files perataining to data taken with the FUV, Lifetime Postition 3 configutation, shown in Fig. 1.2:\n", "\n", "###
Fig 1.2: Screenshot of the COS Spectral Resolution Site - Focus on LP-POS 3
\n", - "\n", - "\n", - "![COS Spectral Resolution Site - Lifetime Position 3](figures/LSFHomepage2.png \"COS Spectral Resolution Site - Lifetime Position 3\")\n", + "\"COS\n", "\n", "Circled in solid red is the button to download the LSF file we need for our data with CENWAVE = 1291. Circled in dashed black is the corresponding CDSF.\n", "\n", From 36b11b29e3c6da5209e8455d63e64fa849eceac3 Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 11 Dec 2023 11:54:42 -0500 Subject: [PATCH 23/30] COS/DataDl.ipynb: fixed a bunch of images. --- notebooks/COS/DataDl/DataDl.ipynb | 57 ++++++++----------------------- 1 file changed, 14 insertions(+), 43 deletions(-) diff --git a/notebooks/COS/DataDl/DataDl.ipynb b/notebooks/COS/DataDl/DataDl.ipynb index 1afce398c..165c922c1 100644 --- a/notebooks/COS/DataDl/DataDl.ipynb +++ b/notebooks/COS/DataDl/DataDl.ipynb @@ -193,9 +193,7 @@ "source": [ "The search page of the HST-specific interface is laid out as in Figure 1.1:\n", "### Fig 1.1\n", - "\n", - "\n", - "![An image of the MAST search site. On the page there are boxes for each different criteria that you want in your search, such as Object Name in the top left, active instruments below that, and at the bottom of the page there is a box that you can fill with a certain column name and your desired criteria (if the criteria is not already on the page). At the top right is the 'My ST' button.](./figures/new_hst_search_login.png \"New HST-specific search website\")\n", + "\"An\n", "\n", "If you are accessing proprietary data, you will need to make an account or log in at the top right under \"MY ST\" (Fig 1.1, boxed in red). If you are accessing non-proprietary data, you may continue without logging in.\n" ] @@ -211,9 +209,7 @@ "* Are taken with the COS instrument, using the G160M grating and either the 1533 or the 1577 cenwave setting\n", "\n", "### Fig 1.2\n", - "\n", - "\n", - "![New HST-specific website search filled out with a COS data query, boxes with each criteria from above are highlighted with a red box. The central wavelength condition was added to the bottom box where you choose columns for criteria, since the central wavelength is not a pre-given criteria.](figures/hst_search_query_updated.png \"New HST-specific website search filled out with a COS data query\")" + "\"New" ] }, { @@ -223,9 +219,7 @@ "The above search results in the table shown in Figure 1.3. \n", "\n", "### Fig 1.3\n", - "\n", - "\n", - "![The page is now the search results page. At the top left are the 'Edit Search' box highlighted in dashed red. Below that is the 'Download Dataset' button highlighted with a green circle. Under that is the list of datasets, with each row beginning with a empty checkbox, followed by information about the dataset, such as search position, dataset name, target name, etc.](figures/new_hst_search_results.png \"Results from new HST-specific search website query\")\n", + "\"The\n", "\n", "If you need to change some parameters in your search - for instance, to also find data from the G130M grating - click on \"Edit Search\" (Fig 1.3, red dashed box).\n", "\n", @@ -240,9 +234,7 @@ "Most COS spectra have preview images (simple plots of flux by wavelength) which can be viewed before downloading the data. Clicking the dataset name (Fig 1.3, blue dashed oval) will take you to a page which shows the preview image, as well as some basic information about the data and whether there were any known failures during the operation. 
An example of such a page is shown in Figure 1.4.\n", "\n", "### Fig 1.4\n", - "\n", - "\n", - "![The page is now an image of a spectrum preview for dataset LPXK51020. It shows two plots of wavelength vs flux. Below is a list of exposure information, such as observation data, exposure time, release date, mode. At the very bottom of the page is the proposal ID along with the PI and CoIs.](figures/preview_spectrum_small.png \"Preview spectrum page\")" + "\"The" ] }, { @@ -252,15 +244,10 @@ "Returning to the results page shown in Fig 1.3 and clicking \"Download Data\" opens a window as shown in Figure 1.5. In this window you can search for filetypes using the search bar, and unselect/select all the data products shown in the filtered list (Fig 1.5, green circle). Clicking the \"Showing all filetypes\" box (Figure 1.5, red box) shows a drop-down (Figure 1.6) where you can choose to show/hide certain types of data such as the uncalibrated data. \n", "\n", "### Fig 1.5\n", - "\n", - "\n", - "![The image shows the a table of files in dataset LDJ1010. Each row shows different CalCOS products, such as the ASN file, X1DSUM, JIT, etc. The title of the table is the dataset name, and to the left of the name is the button to check all or none exposures, highlighted with a green circle. There are columns on the table that give information about the insturment used (COS) and the filter/grating (G140L). Above the table is a dropdown that allows you to choose the filetypes you wish to download.](figures/new_hst_search_downloading_all_files.png \"Choosing what to download in the new HST search website\")\n", + "\"The\n", "\n", "### Fig 1.6\n", - "\n", - "\n", - "![The image shows the same page as Figure 1.5, but the 'Showing all file types' dropdown button has been clicked. There are rows that have checkboxes on the left, each row from top to bottom is labled 'Calibrated', 'Uncalibrated', 'Aux/Other', 'Reference', 'Log/Jitter'](figures/new_hst_search_downloading_filetypes.png \"Choosing which data to download (calibrated/uncalibrated/etc.\")\n", + "\"The\n", "\n", "When all of your desired data products are checked, click \"Start Download\" (Fig 1.5, yellow dashed box). This will download a compressed \"zipped\" folder of all of your data, divided into subdirectories by the observation. Most operating systems can decompress these folders by default. For help decompressing the zipped files, you can follow these links for: [Windows](https://support.microsoft.com/en-us/windows/zip-and-unzip-files-8d28fa72-f2f9-712f-67df-f80cf89fd4e5) and [Mac](https://support.apple.com/guide/mac-help/zip-and-unzip-files-and-folders-on-mac-mchlp2528/mac). There are numerous ways to do this on Linux, however we have not vetted them." ] @@ -277,9 +264,7 @@ "\n", "### Fig 1.7\n", "\n", - "\n", - "![The image is showing the search page for MAST again. Halfway down the page is a box to input dataset name criteria. Above that is the obervation type box, which is composed of three check boxes. These boxes from left to right are 'All', 'Science', and 'Calibration'. The science box is the only one that is checked.](figures/new_hst_search_query2_small.png \"New HST-specific website search filled out with a specific dataset ID\")" + "\"The" ] }, { @@ -362,9 +347,7 @@ "Because we are searching by Dataset ID, we don't need to specify any additional parameters to narrow down the data.\n", "\n", "### Fig 1.8\n", - "\n", - "\n", - "![The image shows the MAST search page, with the top box filled in. 
The title of this box is 'Object name(s) and/or RA and Dec pair(s). The filled in box is the text file obsId_list.txt. The observations box below it has only the science box checked off.](figures/new_search_file_list_small.png \"File Upload Search Form\")\n" + "\"The" ] }, { @@ -374,9 +357,7 @@ "We now can access all the datasets specified in `obsId_list.txt`, as shown in Figure 1.9:\n", "\n", "### Fig 1.9\n", - "\n", - "\n", - "![The image shows the MAST search results page, with the three datasets from our text file shows in each row.](figures/new_search_file_list_res_small.png \"Upload List of Objects Search Results\")\n", + "\"The\n", "\n", "We can select and download their data products as before." ] @@ -394,15 +375,11 @@ "\n", "Navigate to the MAST Portal at , and you will be greeted by a screen where the top looks like Figure 1.10. \n", "### Fig 1.10\n", - "\n", - "\n", - "![The image shows the MAST portal, specifically the search box at the top.](figures/mastp_top.png \"Top of MAST Portal Home\")\n", + "\"The\n", "\n", "Click on \"Advanced Search\" (boxed in red in Figure 1.10). This will open up a new search tab, as shown in Figure 1.11:\n", "### Fig 1.11\n", - "\n", - "\n", - "![The image shows the advanced search pop-up. On the left of the pop-up is a list of check boxes for different search criteria, such as Target name, instrument, mission, etc. To the left of that are larger boxes for different critera, and inside these mobes are rows with checkboxes. For example, there is a larger box labeled 'Observation type' with two checkbox rows labeled 'science' and 'calibration'.](figures/mastp_adv.png \"The advanced search tab\")\n", + "\"The\n", "\n", "Fig 1.11 (above) shows the default search fields which appear. Depending on what you are looking for, these may or may not be the most helpful search fields. By unchecking some of the fields which we are not interested in searching by right now (Figure 1.12, boxed in green), and then entering the parameter values by which to narrow the search into each parameter's box, we generate Fig 1.12. One of the six fields (Mission) by which we are narrowing is boxed in a dashed blue line. The list of applied filters is boxed in red. A dashed pink box at the top left indicates that 2 records were found matching all of these parameters. To its left is an orange box around the \"Search\" button to press to bring up the list of results.\n", "\n", @@ -418,9 +395,7 @@ "|Product Type|spectrum|\n", "\n", "### Fig 1.12\n", - "\n", - "\n", - "![The image shows the same advanced search pop-up, filled out with our desired criteria. The criteria is highlighted, and above it is a list of the applied filters highlighted in red. The top of the pop-up has a 'Search' button, and also lists the records found. There are two records for our search.](figures/mastp_adv_2.png \"The advanced search tab with some selections\")\n", + "\"The\n", "\n" ] }, @@ -431,9 +406,7 @@ "Click the \"Search\" button (boxed in orange), and you will be brought to a page resembling Figure 1.13. \n", "\n", "### Fig 1.13\n", - "\n", - "\n", - "![The image shows the search results list, with consists of two rows for our search. To the left of the rows are checkboxes, then there is are images of a disk, spectra, and three dots. The rest of the columns of the rows show some criteria, such as the mission, observation type, etc. 
To the far right is the image of our object.](figures/mastp_res1.png \"Results of MAST Portal search\")" + "\"The" ] }, { @@ -470,9 +443,7 @@ "metadata": {}, "source": [ "### Fig 1.14\n", - "\n", - "\n", - "![The image shows the Download Manager. The left is a list of filters, where you can choose recommended products, product categories, extensions, and groups. To the left is a file list of all files for the observations. There are columns that label the file size, name, product type, etc.](figures/mastp_cart2.png \"MAST Portal Download Basket\")\n", + "\"The\n", "\n", "Each dataset contains *many* files, most of which are calibration files or intermediate processing files. You may or may not want some of these intermediate files in addition to the final product file.\n", "In the leftmost \"Filters\" section of the Download Basket page, you can narrow which files will be downloaded (Fig 1.14, boxed in red).\n", From bf54b2d7a394c76e99ef4c9033441217dfb6d151 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Mon, 11 Dec 2023 12:01:43 -0500 Subject: [PATCH 24/30] Add WFC3 notebook 'wfc3_image_displayer_analyzer.ipynb' (#105) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * updated _toc.yml and _config.yml files * added trailing spaces to display_image.py, row_column_stats.py * updated notebook-level requirements.txt file. * wfc3_image_displayer_analyzer.ipynb: cleared notebook outputs. * Update display_image.py edits for PEP8 compliance and removing `ginga.util` dependence * Update row_column_stats.py changes for PEP8 compliance * Update row_column_stats.py last missing PEP8 changes * Update requirements.txt removing version pins and adding jupyter * Update wfc3_image_displayer_analyzer.ipynb changes for PEP8 compliance and some minor formatting * Update README.md changes for new information about creating a virtual environment * Update wfc3_image_displayer_analyzer.ipynb missed some PEP8 stuff; hopefully the last commit 🫠 --------- Co-authored-by: bjkuhn --- _config.yml | 1 - _toc.yml | 2 +- .../WFC3/image_displayer_analyzer/README.md | 33 +- .../image_displayer_analyzer/display_image.py | 311 ++++++++---------- .../image_displayer_analyzer/requirements.txt | 8 +- .../row_column_stats.py | 107 +++--- .../wfc3_image_displayer_analyzer.ipynb | 165 +++++----- 7 files changed, 305 insertions(+), 322 deletions(-) diff --git a/_config.yml b/_config.yml index 894344bab..3cf6d03b8 100644 --- a/_config.yml +++ b/_config.yml @@ -53,7 +53,6 @@ exclude_patterns: [notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb, notebooks/WFC3/dash/dash.ipynb, notebooks/WFC3/filter_transformations/filter_transformations.ipynb, notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb, - notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb, notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb, notebooks/WFC3/ir_scattered_light_calwf3_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_Using_calwf3_to_Mask_Bad_Reads.ipynb, notebooks/WFC3/ir_scattered_light_manual_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_by_Manually_Subtracting_Bad_Reads.ipynb, diff --git a/_toc.yml b/_toc.yml index f5cbbd213..89758b603 100644 --- a/_toc.yml +++ b/_toc.yml @@ -65,7 +65,7 @@ parts: - file: notebooks/WFC3/exception_report/wfc3_exception_report.ipynb # - file: notebooks/WFC3/filter_transformations/filter_transformations.ipynb # - file: 
notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb
-#    - file: notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb
+    - file: notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb
 #    - file: notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb
 #    - file: notebooks/WFC3/ir_scattered_light_calwf3_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_Using_calwf3_to_Mask_Bad_Reads.ipynb
 #    - file: notebooks/WFC3/ir_scattered_light_manual_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_by_Manually_Subtracting_Bad_Reads.ipynb
diff --git a/notebooks/WFC3/image_displayer_analyzer/README.md b/notebooks/WFC3/image_displayer_analyzer/README.md
index 4525ced2c..a7540300f 100755
--- a/notebooks/WFC3/image_displayer_analyzer/README.md
+++ b/notebooks/WFC3/image_displayer_analyzer/README.md
@@ -1,21 +1,26 @@
-In this tutorial, we present `display_image`, a tool for displaying full images with metadata, individual WFC3/UVIS chip images, a section of an image with various colormaps/scaling, and individual WFC3/IR `ima` reads. In addition, we present `row_column_stats`, a tool for computing row and column statistics for the types of WFC3 images previously mentioned.
+In this tutorial, we present `display_image`, a tool for displaying full images with metadata, individual WFC3/UVIS chip images,
+a section of an image with various colormaps/scaling, and individual WFC3/IR `ima` reads. In addition, we present
+`row_column_stats`, a tool for computing row and column statistics for the types of WFC3 images previously mentioned.
 
 This directory, once unzipped, should contain this `README.md`, the image
 displayer tool `display_image.py`, the row and column statistic
-tool `row_column_stats.py`, and the Jupyter Notebook tutorial
-`wfc3_imageanalysis.ipynb`. Both of these tools are meant to be used inside of
-a Jupyter Notebook.
+tool `row_column_stats.py`, a `requirements.txt` file for creating a virtual
+environment, and the Jupyter Notebook tutorial `wfc3_image_displayer_analyzer.ipynb`.
+These tools are meant to be used inside a Jupyter Notebook.
 
-In order to run the Jupyter Notebook you must create the virtual
-environment in [WFC3 Library's](https://github.com/spacetelescope/WFC3Library)
-installation instructions. No additional packages are required to run this
-Jupyter Notebook.
+To run this Jupyter Notebook, you must have created a virtual environment that contains (at minimum) the packages listed in the
+requirements.txt file that is included within the repository. We recommend creating a new conda environment using the requirements file:
 
-These tools (specifically `display_image`) look much better in Jupyter Lab
-rather than the classic Jupyter Notebook. If your environment has Jupyter Lab
-installed it's recommended you use that to run the `.ipynb` file. If you're
-interested in adding Jupyter Lab to your environment see the install
-instructions on the [Jupyter website](https://jupyter.org/install).
+```
+$ conda create -n img_disp python=3.11
+$ conda activate img_disp
+$ pip install -r requirements.txt
+```
 
-Questions or concerns should be sent to the [HST Help Desk](https://stsci.service-now.com/hst).
+The tools in this notebook (specifically `display_image`) look much
+better in Jupyter Lab than in the classic Jupyter Notebook. If your
+environment has Jupyter Lab installed, it's recommended you use that to run the
+.ipynb file. 
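For example, one way to add Jupyter Lab to the environment created above and launch the tutorial (a sketch; the `img_disp` name is just the example used in this README, so adapt it to your setup):

```
$ conda activate img_disp
$ pip install jupyterlab
$ jupyter lab wfc3_image_displayer_analyzer.ipynb
```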
See the [Jupyter website](https://jupyter.org/install) for more info. + +Please submit any questions or comments to the [WFC3 Help Desk](https://stsci.service-now.com/hst). --------------------------------------------------------------------- diff --git a/notebooks/WFC3/image_displayer_analyzer/display_image.py b/notebooks/WFC3/image_displayer_analyzer/display_image.py index 922479239..b9f046808 100755 --- a/notebooks/WFC3/image_displayer_analyzer/display_image.py +++ b/notebooks/WFC3/image_displayer_analyzer/display_image.py @@ -1,23 +1,23 @@ #! /usr/bin/env python - -import numpy as np import sys from astropy.io import fits -from ginga.util import zscale +from astropy.visualization import ZScaleInterval import matplotlib.pyplot as plt +import numpy as np + def display_image(filename, - colormaps=['Greys_r','Greys_r','inferno_r'], - scaling=[(None,None),(None,None),(None,None)], + colormaps=['Greys_r', 'Greys_r', 'inferno_r'], + scaling=[(None, None), (None, None), (None, None)], printmeta=False, ima_multiread=False, - figsize=(18,18), + figsize=(18, 18), dpi=200): - - """ A function to display the 'SCI', 'ERR/WHT', and 'DQ/CTX' arrays - of any WFC3 fits image. This function returns nothing, but will display - the requested image on the screen when called. + """ + A function to display the 'SCI', 'ERR/WHT', and 'DQ/CTX' arrays + of any WFC3 fits image. This function returns nothing, but will display + the requested image on the screen when called. Authors ------- @@ -51,8 +51,8 @@ def display_image(filename, List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -66,11 +66,11 @@ def display_image(filename, plotted. If ima_multiread is set to False only the final read of the ima (ext 1) will be plotted. - figsize: (float,float) - The width, height of the figure. Default is (18,18) + figsize: (Float,Float) + The width, height of the figure. Default is (18,18). - dpi: float - The resolution of the figure in dots-per-inch. Default is 200 + dpi: Float + The resolution of the figure in dots-per-inch. Default is 200. 
Returns ------- @@ -108,7 +108,7 @@ def display_image(filename, print("Invalid image section specified") return 0, 0 try: - xstart = int(xsec[: xs]) + xstart = int(xsec[:xs]) except ValueError: print("Problem getting xstart") return @@ -132,7 +132,6 @@ def display_image(filename, print("Problem getting yend") return - bunit = get_bunit(h1) detector = h['detector'] issubarray = h['subarray'] si = h['primesi'] @@ -151,33 +150,32 @@ def display_image(filename, print('-'*44) print(f"Filter = {h['filter']}, Date-Obs = {h['date-obs']} T{h['time-obs']},\nTarget = {h['targname']}, Exptime = {h['exptime']}, Subarray = {issubarray}, Units = {h1['bunit']}\n") - if detector == 'UVIS': - if ima_multiread == True: + if ima_multiread is True: sys.exit("keyword argument 'ima_multiread' can only be set to True for 'ima.fits' files") try: if all_pixels: xstart = 0 ystart = 0 - xend = naxis1 # full x size + xend = naxis1 # full x size yend = naxis2*2 # full y size with fits.open(imagename) as hdu: - uvis2_sci = hdu["SCI",1].data + uvis2_sci = hdu["SCI", 1].data uvis2_err = hdu[2].data uvis2_dq = hdu[3].data - uvis1_sci = hdu["SCI",2].data + uvis1_sci = hdu["SCI", 2].data uvis1_err = hdu[5].data uvis1_dq = hdu[6].data try: - fullsci = np.concatenate([uvis2_sci,uvis1_sci]) - fulldq = np.concatenate([uvis2_dq,uvis1_dq]) - fullerr = np.concatenate([uvis2_err,uvis1_err]) + fullsci = np.concatenate([uvis2_sci, uvis1_sci]) + fulldq = np.concatenate([uvis2_dq, uvis1_dq]) + fullerr = np.concatenate([uvis2_err, uvis1_err]) - fullsci = fullsci[ystart:yend,xstart:xend] - fulldq = fulldq[ystart:yend,xstart:xend] - fullerr = fullerr[ystart:yend,xstart:xend] + fullsci = fullsci[ystart:yend, xstart:xend] + fulldq = fulldq[ystart:yend, xstart:xend] + fullerr = fullerr[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, xstart, xend, ystart, yend, @@ -185,26 +183,26 @@ def display_image(filename, figsize, dpi) except ValueError: - fullsci = np.concatenate([uvis2_sci,uvis1_sci]) - fullsci = fullsci[ystart:yend,xstart:xend] + fullsci = np.concatenate([uvis2_sci, uvis1_sci]) + fullsci = fullsci[ystart:yend, xstart:xend] - z1_sci, z2_sci = get_scale_limits(scaling[0],fullsci,'SCI') + z1_sci, z2_sci = get_scale_limits(scaling[0], fullsci, 'SCI') - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - im1 = ax1.imshow(fullsci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(fullsci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") else: ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.75,pad=.03) + fig.colorbar(im1, ax=ax1, shrink=.75, pad=.03) - except (IndexError,KeyError): + except (IndexError, KeyError): if all_pixels: - xstart = 0 - ystart = 0 - xend = naxis1 # full x size - yend = naxis2 # full y size + xstart = 0 + ystart = 0 + xend = naxis1 # full x size + yend = naxis2 # full y size with fits.open(imagename) as hdu: uvis_ext1 = hdu[1].data @@ -212,35 +210,34 @@ def display_image(filename, uvis_ext3 = hdu[3].data try: - uvis_ext1 = uvis_ext1[ystart:yend,xstart:xend] - uvis_ext2 = uvis_ext2[ystart:yend,xstart:xend] - uvis_ext3 = uvis_ext3[ystart:yend,xstart:xend] + uvis_ext1 = uvis_ext1[ystart:yend, xstart:xend] + uvis_ext2 = uvis_ext2[ystart:yend, xstart:xend] + uvis_ext3 = uvis_ext3[ystart:yend, 
xstart:xend] make1x3plot(scaling, colormaps, uvis_ext1, uvis_ext2, uvis_ext3, xstart, xend, ystart, yend, detector, fname, h1, h2, h3, figsize, dpi) - except (TypeError,IndexError,AttributeError): + except (TypeError, IndexError, AttributeError): - z1_sci, z2_sci = get_scale_limits(scaling[0],uvis_ext1,'SCI') - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - im1 = ax1.imshow(uvis_ext1,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) + z1_sci, z2_sci = get_scale_limits(scaling[0], uvis_ext1, 'SCI') + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(uvis_ext1, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") else: ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.75,pad=.03) - + fig.colorbar(im1, ax=ax1, shrink=.75, pad=.03) if detector == 'IR' and '_ima.fits' not in fname: - if ima_multiread == True: + if ima_multiread is True: sys.exit("keyword argument 'ima_multiread' can only be set to True for 'ima.fits' files") if all_pixels: xstart = 0 ystart = 0 - xend = naxis1 # full x size - yend = naxis2 # full y size + xend = naxis1 # full x size + yend = naxis2 # full y size try: with fits.open(imagename) as hdu: @@ -248,9 +245,9 @@ def display_image(filename, data_err = hdu[2].data data_dq = hdu[3].data - data_sci = data_sci[ystart:yend,xstart:xend] - data_err = data_err[ystart:yend,xstart:xend] - data_dq = data_dq[ystart:yend,xstart:xend] + data_sci = data_sci[ystart:yend, xstart:xend] + data_err = data_err[ystart:yend, xstart:xend] + data_dq = data_dq[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, data_sci, data_err, data_dq, xstart, xend, ystart, yend, @@ -258,49 +255,48 @@ def display_image(filename, figsize, dpi) except (AttributeError, TypeError, ValueError): - z1_sci, z2_sci = get_scale_limits(scaling[0],data_sci,'SCI') - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - im1 = ax1.imshow(data_sci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) - if len(fname) > 18: - ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") - else: - ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.75,pad=.03) - + z1_sci, z2_sci = get_scale_limits(scaling[0], data_sci, 'SCI') + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(data_sci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) + if len(fname) > 18: + ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") + else: + ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") + fig.colorbar(im1, ax=ax1, shrink=.75, pad=.03) if '_ima.fits' in fname: if all_pixels: xstart = 0 ystart = 0 - xend = naxis1 # full x size - yend = naxis2 # full y size + xend = naxis1 # full x size + yend = naxis2 # full y size - if ima_multiread == True: + if ima_multiread is True: nsamps = h['NSAMP'] - for ext in reversed(range(1,nsamps+1)): + for ext in reversed(range(1, nsamps+1)): with fits.open(imagename) as hdu: - data_sci = hdu['SCI',ext].data - data_err = hdu['ERR',ext].data - data_dq = hdu['DQ',ext].data + data_sci = hdu['SCI', ext].data + data_err = hdu['ERR', ext].data + data_dq = hdu['DQ', ext].data - data_sci = data_sci[ystart:yend,xstart:xend] - data_err = data_err[ystart:yend,xstart:xend] - data_dq = 
data_dq[ystart:yend,xstart:xend] + data_sci = data_sci[ystart:yend, xstart:xend] + data_err = data_err[ystart:yend, xstart:xend] + data_dq = data_dq[ystart:yend, xstart:xend] makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, - xstart, xend, ystart, yend, - detector, fname, h1, h2, h3, nsamps, ext, - figsize, dpi) + xstart, xend, ystart, yend, + detector, fname, h1, h2, h3, nsamps, ext, + figsize, dpi) - if ima_multiread == False: + if ima_multiread is False: with fits.open(imagename) as hdu: - data_sci = hdu['SCI',1].data - data_err = hdu['ERR',1].data - data_dq = hdu['DQ',1].data + data_sci = hdu['SCI', 1].data + data_err = hdu['ERR', 1].data + data_dq = hdu['DQ', 1].data - data_sci = data_sci[ystart:yend,xstart:xend] - data_err = data_err[ystart:yend,xstart:xend] - data_dq = data_dq[ystart:yend,xstart:xend] + data_sci = data_sci[ystart:yend, xstart:xend] + data_err = data_err[ystart:yend, xstart:xend] + data_dq = data_dq[ystart:yend, xstart:xend] make1x3plot(scaling, colormaps, data_sci, data_err, data_dq, xstart, xend, ystart, yend, @@ -308,37 +304,9 @@ def display_image(filename, figsize, dpi) -def get_bunit(ext1header): - """ Get the brightness unit for the plot axis label - - Parameters - ---------- - ext1header: Header - The extension 1 header of the fits file being displayed. This is the - extension that contains the brightness unit keyword - - Returns - ------- - The string of the brightness unit for the axis label - {'counts', 'counts/s','e$^-$', 'e$^-$/s'} - - """ - units = ext1header['bunit'] - - if units == 'COUNTS': - return 'counts' - elif units == 'COUNTS/S': - return 'counts/s' - elif units == 'ELECTRONS': - return 'e$^-$' - elif units == 'ELECTRONS/S': - return 'e$^-$/s' - else: - return units - - def get_scale_limits(scaling, array, extname): - """ Get the scale limits to use for the image extension being displayed + """ + Get the scale limits to use for the image extension being displayed. Parameters ---------- @@ -346,8 +314,8 @@ def get_scale_limits(scaling, array, extname): List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -355,40 +323,42 @@ def get_scale_limits(scaling, array, extname): The ImageHDU array that is being displayed. extname: String {"SCI", "ERR", "DQ"} - The name of the extension of which the scale is being determined + The name of the extension of which the scale is being determined. Returns ------- z1: Float - The minimum value for the image scale + The minimum value for the image scale. z2: Float - The maximum value for the image scale + The maximum value for the image scale. 
""" + + z = ZScaleInterval() if extname == 'DQ': - if scaling[0] == None and scaling[1] == None: + if scaling[0] is None and scaling[1] is None: z1, z2 = array.min(), array.max() - elif scaling[0] == None and scaling[1] != None: + elif scaling[0] is None and scaling[1] is not None: z1 = array.min() z2 = scaling[1] - elif scaling[0] != None and scaling[1] == None: + elif scaling[0] is not None and scaling[1] is None: z1 = scaling[0] z2 = array.max() - elif scaling[0] != None and scaling[1] != None: + elif scaling[0] is not None and scaling[1] is not None: z1 = scaling[0] z2 = scaling[1] - + elif extname == 'SCI' or extname == 'ERR': - if scaling[0] == None and scaling[1] == None: - z1, z2 = zscale.zscale(array) - elif scaling[0] == None and scaling[1] != None: - z1 = zscale.zscale(array)[0] + if scaling[0] is None and scaling[1] is None: + z1, z2 = z.get_limits(array) + elif scaling[0] is None and scaling[1] is not None: + z1 = z.get_limits(array)[0] z2 = scaling[1] - elif scaling[0] != None and scaling[1] == None: + elif scaling[0] is not None and scaling[1] is None: z1 = scaling[0] - z2 = zscale.zscale(array)[1] - elif scaling[0] != None and scaling[1] != None: + z2 = z.get_limits(array)[1] + elif scaling[0] is not None and scaling[1] is not None: z1 = scaling[0] z2 = scaling[1] else: @@ -401,8 +371,8 @@ def get_scale_limits(scaling, array, extname): def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, xstart, xend, ystart, yend, detector, fname, h1, h2, h3, - figsize, dpi): - """ Make a 3 column figure to display any WFC3 image or image section + figsize=(9, 6), dpi=100): + """ Make a 3 column figure to display any WFC3 image or image section. Parameters ---------- @@ -410,8 +380,8 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -451,10 +421,10 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, The ending index value for the y-axis of the image. detector: String {"UVIS", "IR"} - The detector used for the image + The detector used for the image. fname: String - The name of the file being plotted + The name of the file being plotted. h1: Header The extension 1 header of the fits file being displayed. @@ -466,10 +436,10 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, The extension 3 header of the fits file being displayed. figsize: (float,float) - The width, height of the figure. Default is (9,6) + The width, height of the figure. Default is (9,6). dpi: float - The resolution of the figure in dots-per-inch. Default is 100 + The resolution of the figure in dots-per-inch. Default is 100. 
Returns ------- @@ -477,15 +447,15 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, """ - z1_sci, z2_sci = get_scale_limits(scaling[0],fullsci,'SCI') - z1_err, z2_err = get_scale_limits(scaling[1],fullerr,'ERR') - z1_dq, z2_dq = get_scale_limits(scaling[2],fulldq,'DQ') + z1_sci, z2_sci = get_scale_limits(scaling[0], fullsci, 'SCI') + z1_err, z2_err = get_scale_limits(scaling[1], fullerr, 'ERR') + z1_dq, z2_dq = get_scale_limits(scaling[2], fulldq, 'DQ') - fig, [ax1,ax2,ax3] = plt.subplots(1,3,figsize=figsize,dpi=dpi) + fig, [ax1, ax2, ax3] = plt.subplots(1, 3, figsize=figsize, dpi=dpi) - im1 = ax1.imshow(fullsci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) - im2 = ax2.imshow(fullerr,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[1],vmin=z1_err, vmax=z2_err) - im3 = ax3.imshow(fulldq, origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[2],vmin=z1_dq, vmax=z2_dq) + im1 = ax1.imshow(fullsci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) + im2 = ax2.imshow(fullerr, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[1], vmin=z1_err, vmax=z2_err) + im3 = ax3.imshow(fulldq, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[2], vmin=z1_dq, vmax=z2_dq) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n{h1['extname']} ext") @@ -495,15 +465,16 @@ def make1x3plot(scaling, colormaps, fullsci, fullerr, fulldq, ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext") ax2.set_title(f"WFC3/{detector} {fname} {h2['extname']} ext") ax3.set_title(f"WFC3/{detector} {fname} {h3['extname']} ext") - fig.colorbar(im1, ax=ax1,shrink=.25,pad=.03) - fig.colorbar(im2, ax=ax2,shrink=.25,pad=.03) - fig.colorbar(im3, ax=ax3,shrink=.25,pad=.03) + fig.colorbar(im1, ax=ax1, shrink=.25, pad=.03) + fig.colorbar(im2, ax=ax2, shrink=.25, pad=.03) + fig.colorbar(im3, ax=ax3, shrink=.25, pad=.03) + def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, xstart, xend, ystart, yend, detector, fname, h1, h2, h3, nsamps, ext, - figsize, dpi): - """ Make a 3 column figure to display any WFC3 IMA image or image section + figsize=(9, 6), dpi=100): + """ Make a 3 column figure to display any WFC3 IMA image or image section. Parameters ---------- @@ -511,8 +482,8 @@ def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, List of real numbers to act as scalings for the SCI, ERR, and DQ arrays. The first element in the list is for the SCI array the second is for the ERR array and the third element in the list is for the DQ extension. If - no scalings are given the default scaling will use - ginga.util.zscale.zscale(). All three scalings must be provided even if + no scalings are given the default scaling will use astropy.visualization + ZScaleInterval.get_limits(). All three scalings must be provided even if only changing 1-2 scalings. E.g. to change SCI array scaling: scaling = [(5E4,8E4),(None,None),(None,None)] @@ -552,10 +523,10 @@ def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, The ending index value for the y-axis of the image. detector: String {"UVIS", "IR"} - The detector used for the image + The detector used for the image. fname: String - The name of the file being plotted + The name of the file being plotted. h1: Header The extension 1 header of the fits file being displayed. 
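The `scaling` tuples documented in these docstrings resolve to display limits as sketched below (a standalone illustration, with random data standing in for a real WFC3 `SCI` array):

```python
import numpy as np
from astropy.visualization import ZScaleInterval

data = np.random.normal(1.0, 0.05, (1014, 1014))  # stand-in for a SCI array

z = ZScaleInterval()
z1, z2 = z.get_limits(data)           # scaling=(None, None): both limits from zscale
print(f"zscale limits: ({z1:.3f}, {z2:.3f})")

# scaling=(0.9, None): pin the lower limit, let zscale choose the upper limit
z1_fixed = 0.9
z2_auto = z.get_limits(data)[1]
print(f"mixed limits: ({z1_fixed}, {z2_auto:.3f})")
```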
@@ -567,16 +538,16 @@ def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, The extension 3 header of the fits file being displayed. nsamps: Integer - The number of samples (readouts) contained in the file + The number of samples (readouts) contained in the file. ext: Integer - The extension to be displayed. Ranges from 1 to nsamp + The extension to be displayed. Ranges from 1 to nsamp. figsize: (float,float) - The width, height of the figure. Default is (9,6) + The width, height of the figure. Default is (9,6). dpi: float - The resolution of the figure in dots-per-inch. Default is 100 + The resolution of the figure in dots-per-inch. Default is 100. Returns ------- @@ -584,17 +555,17 @@ def makeIR1x3plot(scaling, colormaps, data_sci, data_err, data_dq, """ - z1_sci, z2_sci = get_scale_limits(scaling[0],data_sci,'SCI') - z1_err, z2_err = get_scale_limits(scaling[1],data_err,'ERR') - z1_dq, z2_dq = get_scale_limits(scaling[2],data_dq,'DQ') - - fig, [ax1,ax2,ax3] = plt.subplots(1,3,figsize = figsize,dpi=dpi) - im1 = ax1.imshow(data_sci,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[0],vmin=z1_sci, vmax=z2_sci) - im2 = ax2.imshow(data_err,origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[1],vmin=z1_err, vmax=z2_err) - im3 = ax3.imshow(data_dq, origin='lower',extent=(xstart,xend,ystart,yend),cmap=colormaps[2],vmin=z1_dq, vmax=z2_dq) - fig.colorbar(im1, ax=ax1,shrink=.25,pad=.03) - fig.colorbar(im2, ax=ax2,shrink=.25,pad=.03) - fig.colorbar(im3, ax=ax3,shrink=.25,pad=.03) + z1_sci, z2_sci = get_scale_limits(scaling[0], data_sci, 'SCI') + z1_err, z2_err = get_scale_limits(scaling[1], data_err, 'ERR') + z1_dq, z2_dq = get_scale_limits(scaling[2], data_dq, 'DQ') + + fig, [ax1, ax2, ax3] = plt.subplots(1, 3, figsize=figsize, dpi=dpi) + im1 = ax1.imshow(data_sci, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[0], vmin=z1_sci, vmax=z2_sci) + im2 = ax2.imshow(data_err, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[1], vmin=z1_err, vmax=z2_err) + im3 = ax3.imshow(data_dq, origin='lower', extent=(xstart, xend, ystart, yend), cmap=colormaps[2], vmin=z1_dq, vmax=z2_dq) + fig.colorbar(im1, ax=ax1, shrink=.25, pad=.03) + fig.colorbar(im2, ax=ax2, shrink=.25, pad=.03) + fig.colorbar(im3, ax=ax3, shrink=.25, pad=.03) if len(fname) > 18: ax1.set_title(f"WFC3/{detector} {fname}\n {h1['extname']} read {(nsamps+1)-ext}") diff --git a/notebooks/WFC3/image_displayer_analyzer/requirements.txt b/notebooks/WFC3/image_displayer_analyzer/requirements.txt index a624b7053..65f7c3c7c 100644 --- a/notebooks/WFC3/image_displayer_analyzer/requirements.txt +++ b/notebooks/WFC3/image_displayer_analyzer/requirements.txt @@ -1,2 +1,6 @@ -astroquery==0.4.6 -matplotlib==3.7.0 +astropy +astroquery +jupyter +matplotlib +numpy +scipy diff --git a/notebooks/WFC3/image_displayer_analyzer/row_column_stats.py b/notebooks/WFC3/image_displayer_analyzer/row_column_stats.py index da71d38e3..d26b619fc 100755 --- a/notebooks/WFC3/image_displayer_analyzer/row_column_stats.py +++ b/notebooks/WFC3/image_displayer_analyzer/row_column_stats.py @@ -1,5 +1,4 @@ #! 
/usr/bin/env python - import sys import numpy as np @@ -7,6 +6,7 @@ import matplotlib.pyplot as plt from scipy.stats import mode as mode + def get_bunit(ext1header): """ Get the brightness unit for the plot axis label @@ -35,6 +35,7 @@ def get_bunit(ext1header): else: return units + def get_yaxis_and_label(stat, scidata, axes): """ Get the y-axis values and the y axis label for the plot @@ -78,8 +79,9 @@ def get_yaxis_and_label(stat, scidata, axes): return yaxis, ylabel + def makeplot(xaxis, yaxis, axlabel, ylabel, - bunit,detector, fname, h1, ylim, + bunit, detector, fname, h1, ylim, figsize, dpi): """ Make and display the plot for WFC3 UVIS or IR images @@ -121,26 +123,26 @@ def makeplot(xaxis, yaxis, axlabel, ylabel, N/A """ - fig, ax1 = plt.subplots(1, 1, figsize = figsize, dpi=dpi) - # ax1.scatter(xaxis,yaxis,10,alpha=0.75) - ax1.plot(xaxis,yaxis,marker='o',markersize=5,ls='-',alpha=0.75) + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + ax1.plot(xaxis, yaxis, marker='o', markersize=5, ls='-', alpha=0.75) - ax1.set_xlabel(f"{axlabel} Number",size=13) - ax1.set_ylabel(f"{axlabel} {ylabel} [{bunit}]",size=13) + ax1.set_xlabel(f"{axlabel} Number", size=13) + ax1.set_ylabel(f"{axlabel} {ylabel} [{bunit}]", size=13) ax1.grid(alpha=.75) ax1.minorticks_on() - ax1.yaxis.set_ticks_position('both'),ax1.xaxis.set_ticks_position('both') - ax1.tick_params(axis='both',which='minor',direction='in',labelsize = 12,length=4) - ax1.tick_params(axis='both',which='major',direction='in',labelsize = 12,length=7) + ax1.yaxis.set_ticks_position('both'), ax1.xaxis.set_ticks_position('both') + ax1.tick_params(axis='both', which='minor', direction='in', labelsize=12, length=4) + ax1.tick_params(axis='both', which='major', direction='in', labelsize=12, length=7) if len(fname) > 18: - ax1.set_title(f"WFC3/{detector} {fname}\n {h1['extname']} ext",size=14) + ax1.set_title(f"WFC3/{detector} {fname}\n {h1['extname']} ext", size=14) else: - ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext",size=14) - if ylim != None: - ax1.set_ylim(ylim[0],ylim[1]) + ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} ext", size=14) + if ylim is not None: + ax1.set_ylim(ylim[0], ylim[1]) + def make_ima_plot(xaxis, yaxis, axlabel, ylabel, - bunit, detector, fname,h1, ylim, nsamps, ext, + bunit, detector, fname, h1, ylim, nsamps, ext, figsize, dpi): """ Make and display the plot for WFC3 IR IMA images @@ -188,27 +190,27 @@ def make_ima_plot(xaxis, yaxis, axlabel, ylabel, N/A """ - fig, ax1 = plt.subplots(1,1,figsize=figsize,dpi=dpi) - # ax1.scatter(xaxis,yaxis,10,alpha=0.75) - ax1.plot(xaxis,yaxis,marker='o',markersize=5,ls='-',alpha=0.75) + fig, ax1 = plt.subplots(1, 1, figsize=figsize, dpi=dpi) + ax1.plot(xaxis, yaxis, marker='o', markersize=5, ls='-', alpha=0.75) - ax1.set_xlabel(f"{axlabel} Number",size=13) - ax1.set_ylabel(f"{axlabel} {ylabel} [{bunit}]",size=13) + ax1.set_xlabel(f"{axlabel} Number", size=13) + ax1.set_ylabel(f"{axlabel} {ylabel} [{bunit}]", size=13) ax1.grid(alpha=.75) ax1.minorticks_on() - ax1.yaxis.set_ticks_position('both'),ax1.xaxis.set_ticks_position('both') - ax1.tick_params(axis='both',which='minor',direction='in',labelsize = 12,length=4) - ax1.tick_params(axis='both',which='major',direction='in',labelsize = 12,length=7) + ax1.yaxis.set_ticks_position('both'), ax1.xaxis.set_ticks_position('both') + ax1.tick_params(axis='both', which='minor', direction='in', labelsize=12, length=4) + ax1.tick_params(axis='both', which='major', direction='in', labelsize=12, length=7) if len(fname) > 
18: - ax1.set_title(f"WFC3/{detector} {fname}\n {h1['extname']} read {(nsamps+1)-ext}",size=14) + ax1.set_title(f"WFC3/{detector} {fname}\n {h1['extname']} read {(nsamps+1)-ext}", size=14) else: - ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} read {(nsamps+1)-ext}",size=14) - if ylim != None: - ax1.set_ylim(ylim[0],ylim[1]) + ax1.set_title(f"WFC3/{detector} {fname} {h1['extname']} read {(nsamps+1)-ext}", size=14) + if ylim is not None: + ax1.set_ylim(ylim[0], ylim[1]) -def row_column_stats(filename, stat='median', axis='column', ylim=(None,None), + +def row_column_stats(filename, stat='median', axis='column', ylim=(None, None), printmeta=False, ima_multiread=False, plot=True, - figsize=(9,6), dpi=120): + figsize=(9, 6), dpi=120): """ A function to plot the column median vs column number for the 'SCI' data of any WFC3 fits image. @@ -296,7 +298,7 @@ def row_column_stats(filename, stat='median', axis='column', ylim=(None,None), print("Invalid image section specified") return 0, 0 try: - xstart = int(xsec[: xs]) + xstart = int(xsec[:xs]) except ValueError: print("Problem getting xstart") return @@ -351,69 +353,66 @@ def row_column_stats(filename, stat='median', axis='column', ylim=(None,None), sys.exit("keyword argument 'axis' must be set to 'column' or 'row' ") if printmeta: - print(f"\t{si}/{detector} {fname} ") + print(f"\t{si}/{detector} {fname}") print('-'*44) print(f"Filter = {h['filter']}, Date-Obs = {h['date-obs']} T{h['time-obs']},\nTarget = {h['targname']}, Exptime = {h['exptime']}, Subarray = {issubarray}, Units = {h1['bunit']}\n") - - if detector == 'UVIS': - if ima_multiread == True: + if ima_multiread is True: sys.exit("keyword argument 'ima_multiread' can only be set to True for 'ima.fits' files") try: with fits.open(imagename) as hdu: - uvis1_sci = hdu['SCI',2].data - uvis2_sci = hdu['SCI',1].data + uvis1_sci = hdu['SCI', 2].data + uvis2_sci = hdu['SCI', 1].data - uvis_sci = np.concatenate([uvis2_sci,uvis1_sci]) + uvis_sci = np.concatenate([uvis2_sci, uvis1_sci]) if all_pixels: - uvis_sci = uvis_sci[ystart:yend*2,xstart:xend] + uvis_sci = uvis_sci[ystart:yend*2, xstart:xend] if axis == 'row': xaxis = range(ystart, yend*2) else: - uvis_sci = uvis_sci[ystart:yend,xstart:xend] + uvis_sci = uvis_sci[ystart:yend, xstart:xend] except KeyError: with fits.open(imagename) as hdu: - uvis_sci = hdu['SCI',1].data + uvis_sci = hdu['SCI', 1].data if all_pixels: - uvis_sci = uvis_sci[ystart:yend,xstart:xend] + uvis_sci = uvis_sci[ystart:yend, xstart:xend] if axis == 'row': xaxis = range(ystart, yend) else: - uvis_sci = uvis_sci[ystart:yend,xstart:xend] + uvis_sci = uvis_sci[ystart:yend, xstart:xend] - yaxis, ylabel = get_yaxis_and_label(stat,uvis_sci,axes) + yaxis, ylabel = get_yaxis_and_label(stat, uvis_sci, axes) if plot: makeplot(xaxis, yaxis, axlabel, ylabel, bunit, detector, fname, h1, ylim, figsize, dpi) - if detector == 'IR': - if ima_multiread == True: + if ima_multiread is True: nsamps = fits.getheader(imagename)['NSAMP'] - for ext in reversed(range(1,nsamps+1)): + for ext in reversed(range(1, nsamps+1)): with fits.open(imagename) as hdu: - scidata = hdu['SCI',ext].data + scidata = hdu['SCI', ext].data - scidata = scidata[ystart:yend,xstart:xend] + scidata = scidata[ystart:yend, xstart:xend] - yaxis, ylabel = get_yaxis_and_label(stat,scidata,axes) + yaxis, ylabel = get_yaxis_and_label(stat, scidata, axes) if plot: make_ima_plot(xaxis, yaxis, axlabel, ylabel, - bunit, detector, fname,h1, ylim, + bunit, detector, fname, h1, ylim, nsamps, ext, figsize, dpi) - if ima_multiread 
== False: + if ima_multiread is False: with fits.open(imagename) as hdu: - scidata = hdu['SCI',1].data + scidata = hdu['SCI', 1].data - scidata = scidata[ystart:yend,xstart:xend] + scidata = scidata[ystart:yend, xstart:xend] - yaxis, ylabel = get_yaxis_and_label(stat,scidata,axes) + yaxis, ylabel = get_yaxis_and_label(stat, scidata, axes) if plot: makeplot(xaxis, yaxis, axlabel, ylabel, bunit, detector, fname, h1, diff --git a/notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb b/notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb index aa95e3e0b..06ed259e1 100755 --- a/notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb +++ b/notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb @@ -8,6 +8,8 @@ "# WFC3 Image Displayer & Analyzer \n", "***\n", "## Learning Goals\n", + "This notebook provides a method to quickly display images from the Hubble Space Telescope’s Wide Field
\n", + "Camera 3 (WFC3) instrument. This tool also allows the user to derive statistics by row or column in the image.
\n", "\n", "By the end of this tutorial, you will:\n", "\n", @@ -15,14 +17,11 @@ "- Learn how to use the `display_image` tool to display any WFC3 fits file.\n", "- Learn how to use the `row_column_stats` tool to plot row or column statistics for any WFC3 fits file.\n", "\n", - "This notebook provides a method to quickly display images from the Hubble Space Telescope’s Wide Field Camera 3 (WFC3) instrument. This tool also allows the user to derive statistics by row or column in the image.
\n", - "Please make sure you have read the `README.md` file before continuing. \n", - "\n", "## Table of Contents\n", " [Introduction](#intro)
\n", " [1. Imports](#imports)
\n", " [2. Query MAST and download a WFC3 `flt.fits` and `ima.fits` image](#download)
\n", - " [3. display_image](#display)
\n", + " [3. `display_image`](#display)
\n", "       [3.1 Display the full images with metadata](#displayfull)
\n", "       [3.2 Display UVIS1 & UVIS2 separately](#perchip)
\n", "       [3.3 Display an image section and change the `SCI` array colormap](#colormap)
\n", @@ -81,17 +80,18 @@ "\n", "## 1. Imports\n", "\n", - "This notebook assumes you have created the virtual environment in [WFC3 Library's](https://github.com/spacetelescope/WFC3Library) installation instructions.\n", + "
This notebook assumes you have created and activated a virtual environment using the \n", + " requirements file in this notebook's repository. Please make sure you have read the contents of the README file before continuing the notebook.
\n", "\n", "We import:
\n", - "
\n", - "**•** *os* for setting environment variables
\n", - " \n", - "**•** *astroquery.mast Observations* for downloading data from MAST
\n", - "**•** *matplotlib.pyplot* for plotting data\n", - " \n", - "**•** *display_image* for displaying a WFC3 image
\n", - "**•** *row_column_stats* for computing statistics on a WFC3 image" + "\n", + "| Package Name | Purpose |\n", + "|:--------------------------------|:-------------------------------------|\n", + "| `os` | setting environment variables |\n", + "| `astroquery.mast.Observations` | downloading data from MAST |\n", + "| `matplotlib.pyplot` | plotting data |\n", + "| `display_image` | displaying any WFC3 image |\n", + "| `row_column_stats` | computing statistics on a WFC3 image |" ] }, { @@ -119,11 +119,12 @@ "## 2. Query MAST and download a WFC3 `flt.fits` and `ima.fits` image \n", "You may download the data from MAST using either the [Archive Search Engine](https://archive.stsci.edu/hst/search.php) or the [MAST Portal](https://mast.stsci.edu/portal/Mashup/Clients/Mast/Portal.html).\n", "\n", - "Here, we download our images via `astroquery`. For more information, please look at the documentation for [Astroquery](https://astroquery.readthedocs.io/en/latest/),\n", - "[Astroquery.mast](https://astroquery.readthedocs.io/en/latest/mast/mast.html), and \n", + "Here, we download our images via `astroquery`. For more information, please look at the documentation for
\n", + "[Astroquery](https://astroquery.readthedocs.io/en/latest/), [Astroquery.mast](https://astroquery.readthedocs.io/en/latest/mast/mast.html), and \n", "[CAOM Field Descriptions](https://mast.stsci.edu/api/v0/_c_a_o_mfields.html), which is used for the `obs_table` variable.\n", "\n", - "We download images of N2525 from proposal 15145, one being a `flt.fits` in WFC3/UVIS and the other being a `ima.fits` in WFC3/IR. After downloading the images, we move them to our current working directory (cwd)." + "We download images of N2525 from proposal 15145, one being a `flt.fits` in WFC3/UVIS and the other
\n", + "being a `ima.fits` in WFC3/IR. After downloading the images, we move them to our current working directory (cwd)." ] }, { @@ -132,8 +133,8 @@ "metadata": {}, "outputs": [], "source": [ - "query = [('idgga5*','UVIS','FLT',(2,3)),\n", - " ('i*','IR','IMA',(10,11))]\n", + "query = [('idgga5*', 'UVIS', 'FLT', (2, 3)),\n", + " ('i*', 'IR', 'IMA', (10, 11))]\n", "\n", "for criteria in query:\n", " # Get the observation records\n", @@ -158,8 +159,7 @@ " os.rmdir('mastDownload/HST/'+filename[:9])\n", " \n", " os.rmdir('mastDownload/HST/')\n", - " os.rmdir('mastDownload/')\n", - " " + " os.rmdir('mastDownload/')" ] }, { @@ -184,7 +184,9 @@ "source": [ "\n", "## 3. `display_image`\n", - "In this section, we demonstrate the functionality of `display_image`, a useful tool for quickly analyzing WFC3 images. The subsections explain how to display full images with metadata, individual WFC3/UVIS chip images, a section of an image with various colormaps/scaling, and individual WFC3/IR `ima` reads." + "In this section, we demonstrate the functionality of `display_image`, a useful tool for quickly analyzing WFC3 images.
\n", + "The subsections explain how to display full images with metadata, individual WFC3/UVIS chip images, a section of an
\n", + "image with various colormaps/scaling, and individual WFC3/IR `ima` reads." ] }, { @@ -193,7 +195,10 @@ "source": [ "\n", "### 3.1 Display the full images with metadata\n", - "First, we display the `SCI`, `ERR`, and `DQ` arrays for each image and print header info. The default value for `printmeta` is `False`. In the cell below, we set the keyword `printmeta` to `True` to print useful information from the header of the file to the screen. The WFC3/UVIS image is in electrons and the WFC3/IR image is in electrons/second. See Section 2.2.3 of the [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb) for full descriptions of `SCI`, `ERR`, and `DQ` arrays." + "First, we display the `SCI`, `ERR`, and `DQ` arrays for each image and print header info. The default value for `printmeta`
\n", + "is `False`. In the cell below, we set the keyword `printmeta` to `True` to print useful information from the header of the
\n", + "file to the screen. The WFC3/UVIS image is in electrons and the WFC3/IR image is in electrons/second. See Section 2.2.3
\n", + "of the [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb) for full descriptions of `SCI`, `ERR`, and `DQ` arrays." ] }, { @@ -202,8 +207,8 @@ "metadata": {}, "outputs": [], "source": [ - "display_image('idgga5m1q_flt.fits',printmeta=True)\n", - "display_image('idggabk1q_ima.fits',printmeta=True)" + "display_image('idgga5m1q_flt.fits', printmeta=True)\n", + "display_image('idggabk1q_ima.fits', printmeta=True)" ] }, { @@ -212,8 +217,9 @@ "source": [ "\n", "### 3.2 Display UVIS1 & UVIS2 separately\n", - "Next, we display the WFC3/UVIS chips separately. To select a section of an image, append [xstart:xend,ystart:yend] to the image name as shown below.
\n", - "Notice how we need to specify the full axis range `[0:4096,0:2051]` and not simply just `[:,:2051]` as in standard `numpy` notation." + "Next, we display the WFC3/UVIS chips separately. To select a section of an image, append [xstart:xend,ystart:yend] to the
\n", + "image name as shown below. Notice how we need to specify the full axis range `[0:4096,0:2051]` and not simply just
\n", + "`[:,:2051]` as in standard `numpy` notation." ] }, { @@ -222,7 +228,7 @@ "metadata": {}, "outputs": [], "source": [ - "print ('Display UVIS1')\n", + "print('Display UVIS1')\n", "display_image('idgga5m1q_flt.fits[0:4096,2051:4102]') " ] }, @@ -232,7 +238,7 @@ "metadata": {}, "outputs": [], "source": [ - "print ('Display UVIS2')\n", + "print('Display UVIS2')\n", "display_image('idgga5m1q_flt.fits[0:4096,0:2051]') " ] }, @@ -242,7 +248,8 @@ "source": [ "\n", "### 3.3 Display an image section and change the `SCI` array colormap\n", - "Then, we display `SCI` arrays with a different colormap. Regardless of how many colormaps are being changed, all three colormaps must be provided. The elements of `colormaps` sequentially correspond with the `SCI`, `ERR`, and `DQ` arrays." + "Then, we display `SCI` arrays with a different colormap. Regardless of how many colormaps are being changed, all three
\n", + "colormaps must be provided. The elements of `colormaps` sequentially correspond with the `SCI`, `ERR`, and `DQ` arrays." ] }, { @@ -252,7 +259,7 @@ "outputs": [], "source": [ "display_image('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", - " colormaps = [\"viridis\",\"Greys_r\",\"inferno_r\"])" + " colormaps=[\"viridis\", \"Greys_r\", \"inferno_r\"])" ] }, { @@ -261,7 +268,10 @@ "source": [ "\n", "### 3.4 Change the scaling of the `SCI` and `ERR` arrays\n", - "Now, we change the scaling of the `SCI` and `ERR` arrays. Regardless of how many scalings are being changed, all three scalings must be provided. The elements of `scaling` sequentially correspond with the `SCI`, `ERR`, and `DQ` arrays. The default scaling value of `None` uses `ginga.util.zscale.zscale()` for scaling (see [documentation](https://ginga.readthedocs.io/en/stable/) for more information). " + "Now, we change the scaling of the `SCI` and `ERR` arrays. Regardless of how many scalings are being changed, all three
\n", + "scalings must be provided. The elements of `scaling` sequentially correspond with the `SCI`, `ERR`, and `DQ` arrays.
\n", + "The default scaling value of `None` uses `astropy.visualization.ZScaleInterval()` for scaling
\n", + "(see [documentation](https://docs.astropy.org/en/stable/api/astropy.visualization.ZScaleInterval.html) for more information). " ] }, { @@ -271,8 +281,8 @@ "outputs": [], "source": [ "display_image('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", - " colormaps = [\"viridis\",\"viridis\",\"inferno_r\"],\n", - " scaling = [(50000,80000),(None,400),(None,None)])" + " colormaps=[\"viridis\", \"viridis\", \"inferno_r\"],\n", + " scaling=[(50000, 80000), (None, 400), (None, None)])" ] }, { @@ -291,8 +301,8 @@ "outputs": [], "source": [ "display_image('idggabk1q_ima.fits[43:55,299:311]',\n", - " colormaps = [\"viridis\",\"Greys_r\",\"inferno_r\"],\n", - " scaling=[(2,18),(None,None),(None,None)],\n", + " colormaps=[\"viridis\", \"Greys_r\", \"inferno_r\"],\n", + " scaling=[(2, 18), (None, None), (None, None)],\n", " ima_multiread=True)" ] }, @@ -302,7 +312,9 @@ "source": [ "\n", "## 4. `row_column_stats`\n", - "In this section, we demonstrate the functionality of `row_column_stats`, a useful tool for quickly computing WFC3 statistics. The subsections explain how to compute row and column statistics for a full image, individual WFC3/UVIS chips, a section of an image, and individual `ima` reads. The row/column numbers are on the x-axis and the statistics are on the y-axis." + "In this section, we demonstrate the functionality of `row_column_stats`, a useful tool for quickly computing WFC3 statistics.
\n", + "The subsections explain how to compute row and column statistics for a full image, individual WFC3/UVIS chips, a section of
\n", + "an image, and individual `ima` reads. The row/column numbers are on the x-axis and the statistics are on the y-axis." ] }, { @@ -321,14 +333,14 @@ "outputs": [], "source": [ "# plot column median for the full image\n", - "x,y = row_column_stats('idgga5m1q_flt.fits',\n", - " stat='median',\n", - " axis='column')\n", + "x, y = row_column_stats('idgga5m1q_flt.fits',\n", + " stat='median',\n", + " axis='column')\n", "\n", "# plot column standard deviation for the full image\n", - "x,y = row_column_stats('idgga5m1q_flt.fits',\n", - " stat='stddev',\n", - " axis='column')" + "x, y = row_column_stats('idgga5m1q_flt.fits',\n", + " stat='stddev',\n", + " axis='column')" ] }, { @@ -348,22 +360,22 @@ "outputs": [], "source": [ "# get column median values for UVIS2 but don't plot \n", - "x2,y2 = row_column_stats('idgga5m1q_flt.fits[0:4096,0:2051]',\n", - " stat='median',\n", - " axis='column',\n", - " plot=False)\n", + "x2, y2 = row_column_stats('idgga5m1q_flt.fits[0:4096,0:2051]',\n", + " stat='median',\n", + " axis='column',\n", + " plot=False)\n", "\n", "# get column median values for UVIS1 but don't plot \n", - "x1,y1 = row_column_stats('idgga5m1q_flt.fits[0:4096,2051:4102]',\n", - " stat='median',\n", - " axis='column',\n", - " plot=False)\n", + "x1, y1 = row_column_stats('idgga5m1q_flt.fits[0:4096,2051:4102]',\n", + " stat='median',\n", + " axis='column',\n", + " plot=False)\n", "\n", "# overplot UVIS1 and UVIS2 data on one figure \n", - "plt.figure(figsize=(8,6),dpi=130)\n", + "plt.figure(figsize=(8, 6), dpi=130)\n", "plt.grid(alpha=.5)\n", - "plt.plot(x1,y1,marker='.',label='UVIS 1',color='k')\n", - "plt.plot(x2,y2,marker='.',label='UVIS 2',color='C3')\n", + "plt.plot(x1, y1, marker='.', label='UVIS 1', color='k')\n", + "plt.plot(x2, y2, marker='.', label='UVIS 2', color='C3')\n", "plt.title('WFC3/UVIS idgga5m1q_flt.fits')\n", "plt.xlabel('Column Number')\n", "plt.ylabel('Column Median [e-]')\n", @@ -390,14 +402,14 @@ "display_image('idgga5m1q_flt.fits[3420:3575,2590:2770]')\n", "\n", "# plot row mean for a section of the image\n", - "x,y= row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", - " stat='mean',\n", - " axis='row')\n", + "x, y = row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", + " stat='mean',\n", + " axis='row')\n", "\n", "# plot column mean for a section of the image\n", - "x,y= row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", - " stat='mean',\n", - " axis='column')" + "x, y = row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", + " stat='mean',\n", + " axis='column')" ] }, { @@ -421,16 +433,16 @@ "display_image('idgga5m1q_flt.fits[3420:3575,2590:2770]')\n", "\n", "# plot row mean for single source with custom yaxis limits\n", - "x,y= row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", - " stat='mean',\n", - " axis='row',\n", - " ylim=(y1,y2))\n", + "x, y = row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", + " stat='mean',\n", + " axis='row',\n", + " ylim=(y1, y2))\n", "\n", "# plot column mean for single source with custom yaxis limits\n", - "x,y= row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", - " stat='mean',\n", - " axis='column',\n", - " ylim=(y1,y2))" + "x, y = row_column_stats('idgga5m1q_flt.fits[3420:3575,2590:2770]',\n", + " stat='mean',\n", + " axis='column',\n", + " ylim=(y1, y2))" ] }, { @@ -453,10 +465,10 @@ "display_image('idggabk1q_ima.fits[43:55,299:311]')\n", "\n", "# plot column mean for section of ima\n", - "x,y = 
row_column_stats('idggabk1q_ima.fits[43:55,299:311]',\n",
-    "                     stat='mean',\n",
-    "                     axis='column',\n",
-    "                     ima_multiread=True)"
+    "x, y = row_column_stats('idggabk1q_ima.fits[43:55,299:311]',\n",
+    "                        stat='mean',\n",
+    "                        axis='column',\n",
+    "                        ima_multiread=True)"
   ]
  },
  {
@@ -510,18 +522,11 @@
    "[Top of Page](#top)\n",
    "\"Space "
   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
   }
 ],
 "metadata": {
  "kernelspec": {
-   "display_name": "Python 3",
+   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
@@ -535,7 +540,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-   "version": "3.7.5"
+   "version": "3.11.5"
  }
 },
 "nbformat": 4,

From d4e2c31a9efd8bdb0a5475ba21679bd75c66e4e8 Mon Sep 17 00:00:00 2001
From: dulude
Date: Mon, 11 Dec 2023 12:02:08 -0500
Subject: [PATCH 25/30] STIS/calstis_2d_ccd.ipynb: fixed two images.

---
 notebooks/STIS/calstis/calstis_2d_ccd.ipynb | 11 +++--------
 1 file changed, 3 insertions(+), 8 deletions(-)

diff --git a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb
index bd9b3056b..c0d834deb 100644
--- a/notebooks/STIS/calstis/calstis_2d_ccd.ipynb
+++ b/notebooks/STIS/calstis/calstis_2d_ccd.ipynb
@@ -367,11 +367,8 @@
    "The BLEVCORR step is part of basic 2-D image reduction for CCD data only. This step subtracts the electronic bias level for each line of the CCD image and trims the overscan regions off of the input image, leaving only the exposed portions of the image. \n",
    "\n",
    "Because the electronic bias level can vary with time and temperature, its value is determined from the overscan region in the particular exposure being processed. This bias is applied equally to real pixels (main detector and physical overscan) and the virtual overscan region (pixels that don't actually exist, but are recorded when the detector clocks out extra times after reading out all the parallel rows). A raw STIS CCD image in full frame unbinned mode has 19 leading and trailing columns of serial physical overscan in the AXIS1 (x direction), and 20 rows of virtual overscan in the AXIS2 (y direction); therefore the size of the uncalibrated and unbinned full frame CCD image is 1062(serial) $\times$ 1044(parallel) pixels, with 1024 $\times$ 1024 exposed science pixels.\n",
-    "\n",
-    "![Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.](figures/CCD_overscan.jpg \"Graph illustrating parallel serial overscan corresponding to wavelength in the x-axis and virtual overscan corresponding to position along slit in the y-axis.\")"
+    "\"Graph"
   ]
  },
 {
@@ -382,11 +379,9 @@
    "The electronic bias level is subtracted line-by-line. An initial value of electronic bias level is determined for each line of the image using the serial and parallel overscans, and a straight line is fitted to the bias as a function of image line. The initial electronic bias for each line is determined by taking the median of a predetermined subset of the trailing serial overscan pixels, which currently includes most of the trailing overscan region except the first and last three pixels, and pixels flagged with bad data quality flags. The actual overscan bias subtracted from the image is the value of the linear fit at a specific image line. 
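Schematically, that per-line estimate and fit can be pictured as follows (a toy sketch with synthetic numbers, not the actual calstis implementation; the column slice stands in for the predetermined overscan subset):

```python
import numpy as np

# Synthetic full-frame raw CCD: 1044 lines x 1062 columns with a ~1510 count bias level
rng = np.random.default_rng(1)
raw = rng.normal(1510.0, 3.0, size=(1044, 1062))

trailing_overscan = raw[:, 1046:1059]               # stand-in for the overscan subset

line_bias = np.median(trailing_overscan, axis=1)    # initial bias estimate per image line
lines = np.arange(line_bias.size)
slope, intercept = np.polyfit(lines, line_bias, 1)  # straight line fitted vs. line number
fitted_bias = slope * lines + intercept

debiased = raw - fitted_bias[:, None]               # subtract the fit value line-by-line
meanblev = fitted_bias.mean()                       # analogous to the MEANBLEV keyword
```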
The mean value of all overscan levels is written to the output SCI extension header as MEANBLEV.\n",
    "\n",
    "The BLEVCORR step also trims the image of overscan. The size of the overscan regions depends on binning and on whether the image is full frame or a subimage, and the locations of the overscan regions depend on which amplifier was used for readout. The number of pixels trimmed during CCD bias level correction on each side is given in the following table.\n",
-    "\n",
-    "![The number of pixels trimmed during CCD bias level correction on each side](figures/pixels_trimmed.jpg)\n"
+    "\"The\n",
+    "\n"
   ]
  },

From 6c8512564c3bf9960d4906a32e8f08a90c8b953b Mon Sep 17 00:00:00 2001
From: dulude
Date: Mon, 11 Dec 2023 12:07:35 -0500
Subject: [PATCH 26/30] WFC3/calwf3_with_v1.0_PCTE.ipynb: fixed three images.

---
 .../WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb  | 11 +++--------
 1 file changed, 3 insertions(+), 8 deletions(-)

diff --git a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb
index 14d358b22..26ea2efa3 100644
--- a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb
+++ b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb
@@ -683,9 +683,7 @@
   "source": [
    "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", "\n", - "\n", - "\n", - "![An animated gif blinking between a subsection of background sky using the v1.0 and V2.0 pixel-based CTE corrections. The v2.0 background appears smoother with less noise and pixel variations.](example/v1_v2_bkg.gif \"An animated gif blinking between a subsection of background sky using the v1.0 and V2.0 pixel-based CTE corrections. The v2.0 background appears smoother with less noise and pixel variations.\")\n" + "\"An\n" ] }, { @@ -773,9 +771,7 @@ "metadata": {}, "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", - "\n", - "\n", - "![An animated GIF of the v1.0 and v2.0 FLC image subsections](example/v1_v2_subsection.gif \"Animated GIF of the v1.0 and v2.0 FLC image subsections\")" + "\"An" ] }, { @@ -791,8 +787,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n", - "![Aperture photometry illustration](example/apphot_image.png \"Aperture photometry illustration\")" + "\"Aperture" ] }, { From 0471b325d072d1f7c8180145610a99643a10fa72 Mon Sep 17 00:00:00 2001 From: Michael Dulude Date: Mon, 11 Dec 2023 12:31:06 -0500 Subject: [PATCH 27/30] Add WFC3 notebook 'IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb' (#106) * updated _toc.yml and _config.yml files * ima_visualization_and_differencing.py: added trailing line * IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb: cleared all notebook outputs. * PEP8 corrections and removing unecessary imports, pinnin working code version in requirements.txt * a few more PEP8 corrections, removing Python version from requirement.txt * Recommended changes after notebook review. * Fix style error --------- Co-authored-by: annierose3 Co-authored-by: annierose3 <112646956+annierose3@users.noreply.github.com> Co-authored-by: Fred Dauphin <53876625+FDauphin@users.noreply.github.com> Co-authored-by: Hatice Karatay --- _config.yml | 1 + _toc.yml | 2 +- ..._Example_of_Time_Variable_Background.ipynb | 215 ++++++++++-------- .../ima_visualization_and_differencing.py | 206 +++++++++-------- .../ir_ima_visualization/requirements.txt | 2 +- 5 files changed, 228 insertions(+), 198 deletions(-) diff --git a/_config.yml b/_config.yml index 3cf6d03b8..894344bab 100644 --- a/_config.yml +++ b/_config.yml @@ -53,6 +53,7 @@ exclude_patterns: [notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb, notebooks/WFC3/dash/dash.ipynb, notebooks/WFC3/filter_transformations/filter_transformations.ipynb, notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb, + notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb, notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb, notebooks/WFC3/ir_scattered_light_calwf3_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_Using_calwf3_to_Mask_Bad_Reads.ipynb, notebooks/WFC3/ir_scattered_light_manual_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_by_Manually_Subtracting_Bad_Reads.ipynb, diff --git a/_toc.yml b/_toc.yml index 89758b603..cec17e3ab 100644 --- a/_toc.yml +++ b/_toc.yml @@ -66,7 +66,7 @@ parts: # - file: notebooks/WFC3/filter_transformations/filter_transformations.ipynb # - file: notebooks/WFC3/flux_conversion_tool/flux_conversion_tool.ipynb - file: notebooks/WFC3/image_displayer_analyzer/wfc3_image_displayer_analyzer.ipynb -# - file: notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb + - file: notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb # - file: notebooks/WFC3/ir_scattered_light_calwf3_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_Using_calwf3_to_Mask_Bad_Reads.ipynb # - file: notebooks/WFC3/ir_scattered_light_manual_corrections/Correcting_for_Scattered_Light_in_IR_Exposures_by_Manually_Subtracting_Bad_Reads.ipynb - file: notebooks/WFC3/persistence/wfc3_ir_persistence.ipynb diff --git a/notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb 
b/notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb
index 98f7af2d1..7dc2f10eb 100644
--- a/notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb
+++ b/notebooks/WFC3/ir_ima_visualization/IR_IMA_Visualization_with_an_Example_of_Time_Variable_Background.ipynb
@@ -96,7 +96,6 @@
     "- *matplotlib.pyplot* for plotting data\n",
     "- *astropy.io fits* for accessing FITS files\n",
     "- *astroquery* for downloading data from MAST\n",
-    "- *ginga* for scaling using zscale\n",
     "\n",
     "We import the following module:\n",
     "- *ima_visualization_and_differencing* to take the difference between reads, plot the ramp, and to visualize the difference in images\n"
    ]
   },
  {
@@ -109,15 +108,10 @@
    "outputs": [],
    "source": [
     "import os\n",
-    "\n",
     "import numpy as np\n",
     "from matplotlib import pyplot as plt\n",
-    "import matplotlib.patheffects as path_effects\n",
-    "\n",
     "from astropy.io import fits\n",
     "from astroquery.mast import Observations\n",
-    "from ginga.util.zscale import zscale\n",
-    "\n",
     "import ima_visualization_and_differencing as diff\n",
     "\n",
     "%matplotlib inline"
    ]
   },
  {
@@ -162,9 +156,9 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "EX_OBS = Observations.query_criteria(obs_id = 'ICQTBB020')\n",
+    "EX_OBS = Observations.query_criteria(obs_id='ICQTBB020')\n",
     "EXOBS_Prods = Observations.get_product_list(EX_OBS)\n",
-    "yourProd = Observations.filter_products(EXOBS_Prods, obs_id = ['icqtbbbxq', 'icqtbbc0q'], extension = [\"_ima.fits\",\"_flt.fits\"])"
+    "yourProd = Observations.filter_products(EXOBS_Prods, obs_id=['icqtbbbxq', 'icqtbbc0q'], extension=[\"_ima.fits\", \"_flt.fits\"])"
    ]
   },
  {
@@ -182,7 +176,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "Observations.download_products(yourProd, mrp_only = False, cache = False)"
+    "Observations.download_products(yourProd, mrp_only=False, cache=False)"
    ]
   },
  {
@@ -194,13 +188,13 @@
     "\n",
     "The figure below shows the WFC3/IR file structure corresponding to [Figure 2.4 of the Data Handbook](https://hst-docs.stsci.edu/wfc3dhb/chapter-2-wfc3-data-structure/2-2-wfc3-file-structure). Note that for WFC3/IR data, each read or image set (IMSET) consists of five data arrays: SCI, ERR, DQ, SAMP, TIME. Consecutive MULTIACCUM readouts are stored in reverse chronological order, with [SCI,1] corresponding to the final, cumulative exposure. \n",
     "\n",
-    "\"Drawing\"\n",
+    "\"Diagram\n",
     "\n",
     "The table below lists the IMSET, SAMPNUM, and SAMPTIME for a WFC3/IR SPARS100 exposure, modified from [Section 7.7 of the Instrument Handbook](https://hst-docs.stsci.edu/wfc3ihb/chapter-7-ir-imaging-with-wfc3/7-7-ir-exposure-and-readout). Note that the image header keyword NSAMP reports a value of 16, but there are actually 15 science reads in the IMA file, following the 0th read (which has an exposure time of 0). While the NSAMP keyword is reported in the primary header (extension 0), the SAMPNUM and SAMPTIME keywords may be found in the science header of each read, and these report the read (IMSET) number and the cumulative exposure time of each respective read. \n",
     "\n",
     "This table is similar to [Table 7.7](https://hst-docs.stsci.edu/wfc3ihb/chapter-7-ir-imaging-with-wfc3/7-7-ir-exposure-and-readout#id-7.7IRExposureandReadout-table7.8), except that the column labelled NSAMP in the handbook is really the SAMPNUM. 
Note that we have added a row at the top of the table to highlight that IMSET [SCI,16] corresponds to the 0th read.\n", "\n", - "\"Drawing\"" + "\"Table" ] }, { @@ -231,7 +225,7 @@ "flt_nominal = 'mastDownload/HST/icqtbbc0q/icqtbbc0q_flt.fits'\n", "\n", "image = fits.open(ima_scattered)\n", - "image.info()\n" + "image.info()" ] }, { @@ -258,10 +252,10 @@ "metadata": {}, "outputs": [], "source": [ - "sci16hdr = image['SCI',16].header\n", + "sci16hdr = image['SCI', 16].header\n", "SAMPNUM_sci16 = sci16hdr['SAMPNUM']\n", "SAMPTIME_sci16 = sci16hdr['SAMPTIME']\n", - "print(f'For sample number {SAMPNUM_sci16}, the exposure time is {SAMPTIME_sci16}s.')\n" + "print(f'For sample number {SAMPNUM_sci16}, the exposure time is {SAMPTIME_sci16}s.')" ] }, { @@ -270,7 +264,7 @@ "metadata": {}, "outputs": [], "source": [ - "sci1hdr = image['SCI',1].header\n", + "sci1hdr = image['SCI', 1].header\n", "SAMPNUM_sci1 = sci1hdr['SAMPNUM']\n", "SAMPTIME_sci1 = sci1hdr['SAMPTIME']\n", "print(f'For sample number {SAMPNUM_sci1}, the exposure time is {SAMPTIME_sci1:.3f}s.')\n", @@ -298,35 +292,32 @@ "metadata": {}, "outputs": [], "source": [ - "fig = plt.figure(figsize = (30, 30))\n", - "fig\n", + "fig = plt.figure(figsize=(30, 30))\n", "rows = 2\n", "columns = 2\n", + "files = [ima_scattered, flt_scattered, ima_nominal, flt_nominal]\n", "\n", - "files = [ima_scattered, flt_scattered, ima_nominal, flt_nominal] \n", - "# If only analyzing one image, please remove the second ima,flt pair from the list \n", + "# If only analyzing one image, please remove the second ima,flt pair from the list\n", + "subplot_titles = ['scattered', 'nominal']\n", "\n", - "for i,file in enumerate(files):\n", + "for i, file in enumerate(files):\n", " path, filename = os.path.split(file)\n", - " \n", - " image = fits.open(file)\n", "\n", - " ax = fig.add_subplot(rows, columns, i+1)\n", - " ax.set_title(filename, fontsize = 20)\n", - " \n", - " subplot_titles = ['scattered', 'nominal']\n", - " #Please change the vmin and vmax values to fit your own data\n", - " if i == 0 or i == 1:\n", - " ax.set_title(f'{filename}, {subplot_titles[i//2]}', fontsize = 20)\n", - " im = ax.imshow(image[\"SCI\", 1].data, origin = 'lower',cmap = 'Greys_r', vmin = 0.25, vmax = 1.7)\n", - " else:\n", - " ax.set_title(f'{filename}, {subplot_titles[i//2]}', fontsize = 20)\n", - " im = ax.imshow(image[\"SCI\", 1].data, origin = 'lower',cmap = 'Greys_r', vmin = 0.5, vmax = 1.2)\n", - " plt.colorbar(im, ax = ax)\n", + " with fits.open(file) as image:\n", + " ax = fig.add_subplot(rows, columns, i + 1)\n", + " title = f'{filename}, {subplot_titles[i//2]}'\n", + " ax.set_title(title, fontsize=20)\n", + "\n", + " # Please change the vmin and vmax values to fit your own data\n", + " vmin, vmax = (0.25, 1.7) if i < 2 else (0.5, 1.2)\n", + " im = ax.imshow(\n", + " image['SCI', 1].data, origin='lower', cmap='Greys_r', vmin=vmin, vmax=vmax\n", + " )\n", + " plt.colorbar(im, ax=ax)\n", "\n", - "plt.subplots_adjust(bottom = 0.2, right = 0.5, top = 0.5)\n", - "plt.rc('xtick', labelsize = 10) \n", - "plt.rc('ytick', labelsize = 10) " + "plt.subplots_adjust(bottom=0.2, right=0.5, top=0.5)\n", + "plt.rc('xtick', labelsize=10)\n", + "plt.rc('ytick', labelsize=10)" ] }, { @@ -354,10 +345,10 @@ "outputs": [], "source": [ "try:\n", - " diff.plot_ima_subplots(ima_filename = ima_scattered, vmin = 0, vmax = 2.2)\n", - " \n", + " diff.plot_ima_subplots(ima_filename=ima_scattered, vmin=0, vmax=2.2)\n", + "\n", "except FileNotFoundError:\n", - " print(\"No file by this name found\")\n" + " 
print(\"No file by this name found\")" ] }, { @@ -376,7 +367,7 @@ "outputs": [], "source": [ "try:\n", - " diff.plot_ima_subplots(ima_filename = ima_nominal, vmin = 0, vmax = 2)\n", + " diff.plot_ima_subplots(ima_filename=ima_nominal, vmin=0, vmax=2)\n", "\n", "except FileNotFoundError:\n", " print(\"No file by this name found\")" @@ -395,31 +386,31 @@ "metadata": {}, "outputs": [], "source": [ - "fig = plt.figure(figsize = (10, 8))\n", + "fig = plt.figure(figsize=(10, 8))\n", "\n", - "ima_files=[ima_scattered, ima_nominal] \n", - "#If only using one image, please remove the extraneous image from this list \n", + "ima_files = [ima_scattered, ima_nominal] \n", + "# If only using one image, please remove the extraneous image from this list \n", "\n", - "marker_select = ['o','s']\n", - "color_select = ['black','C0']\n", + "marker_select = ['o', 's']\n", + "color_select = ['black', 'C0']\n", + "plt.rcParams.update({'font.size': 15})\n", "for i, ima in enumerate(ima_files):\n", " path, filename = os.path.split(ima)\n", - " \n", + "\n", " cube, integ_time = diff.read_wfc3(ima)\n", - " median_fullframe = np.nanmedian(cube, axis = (0,1))\n", + " median_fullframe = np.nanmedian(cube, axis=(0, 1))\n", "\n", - " plt.rcParams.update({'font.size':15})\n", " plt.plot(integ_time[1:], median_fullframe[1:]*integ_time[1:],\n", - " marker = marker_select[i], markersize = 8, color = color_select[i], label = filename)\n", + " marker=marker_select[i], markersize=8, \n", + " color=color_select[i], label=filename)\n", " plt.legend()\n", "plt.grid()\n", - "plt.xlabel('Integ. Time (s)', fontsize = 15)\n", - "plt.ylabel('electrons', fontsize = 15)\n", - "plt.rc('xtick', labelsize = 15) \n", - "plt.rc('ytick', labelsize = 15) \n", - "plt.grid(visible = True)\n", - "_=plt.title(\"Comparison of Signal Accumulation Ramp in Nominal vs. Scattered Light Images\", fontsize=15)\n", - " " + "plt.xlabel('Integ. Time (s)', fontsize=15)\n", + "plt.ylabel('electrons', fontsize=15)\n", + "plt.rc('xtick', labelsize=15) \n", + "plt.rc('ytick', labelsize=15) \n", + "plt.grid(visible=True)\n", + "_ = plt.title(\"Comparison of Signal Accumulation Ramp in Nominal vs. 
Scattered Light Images\", fontsize=15)" ] }, { @@ -441,43 +432,51 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ - "fig = plt.figure(figsize = (50, 20))\n", + "fig = plt.figure(figsize=(50, 20))\n", "fig\n", "rows = 1\n", "columns = 2\n", "ima_files = [ima_scattered, ima_nominal] \n", - "#If only using one image, please remove the extraneous image from this list \n", + "# If only using one image, please remove the extraneous image from this list \n", "\n", "subplot_titles = ['scattered', 'nominal']\n", "\n", + "lhs_region = {\"x0\": 50, \"x1\": 250, \"y0\": 100, \"y1\": 900}\n", + "rhs_region = {\"x0\": 700, \"x1\": 900, \"y0\": 100, \"y1\": 900}\n", + "\n", + "plt.rcParams.update({'font.size': 40})\n", + "\n", + "for i, ima in enumerate(ima_files):\n", "\n", - "lhs_region={\"x0\":50,\"x1\":250,\"y0\":100,\"y1\":900}\n", - "rhs_region={\"x0\":700,\"x1\":900,\"y0\":100,\"y1\":900}\n", - "for i,ima in enumerate(ima_files):\n", - " \n", " path, filename = os.path.split(ima)\n", - " \n", + "\n", " cube, integ_time = diff.read_wfc3(ima)\n", + "\n", + " median_fullframe, median_lhs, median_rhs = diff.get_median_fullframe_lhs_rhs(cube, \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)\n", " \n", - " median_fullframe, median_lhs, median_rhs = diff.get_median_fullframe_lhs_rhs(cube, lhs_region = lhs_region, rhs_region = rhs_region)\n", - " plt.rcParams.update({'font.size':40})\n", " ax = fig.add_subplot(rows, columns, i+1)\n", - " ax.plot(integ_time[1:], median_fullframe[1:]*integ_time[1:], 's', markersize = 25, label = 'Full Frame', color = 'black')\n", - " ax.plot(integ_time[1:], median_lhs[1:]*integ_time[1:], '<', markersize = 20, label = 'LHS', color = 'C1')\n", - " ax.plot(integ_time[1:], median_rhs[1:]*integ_time[1:], '>', markersize = 20, label = 'RHS', color= 'C2')\n", - " ax.set_ylim(0,1800)\n", + " ax.plot(integ_time[1:], median_fullframe[1:]*integ_time[1:], 's', \n", + " markersize=25, label='Full Frame', color='black')\n", + " \n", + " ax.plot(integ_time[1:], median_lhs[1:]*integ_time[1:], '<', \n", + " markersize=20, label='LHS', color='C1')\n", + " \n", + " ax.plot(integ_time[1:], median_rhs[1:]*integ_time[1:], '>', \n", + " markersize=20, label='RHS', color='C2')\n", + " \n", + " ax.set_ylim(0, 1800)\n", " ax.grid()\n", " ax.set_xlabel('Integ. 
Time (s)')\n", " ax.set_ylabel('electrons')\n", - " ax.legend(loc = 0)\n", - " _=ax.set_title(f'{filename}, {subplot_titles[i]}', fontsize = 40)\n", - " ax.tick_params(axis = \"x\", labelsize = 30) \n", - " ax.tick_params(axis = \"y\", labelsize = 30) " + " ax.legend(loc=0)\n", + " _ = ax.set_title(f'{filename}, {subplot_titles[i]}', fontsize=40)\n", + " ax.tick_params(axis=\"x\", labelsize=30) \n", + " ax.tick_params(axis=\"y\", labelsize=30) " ] }, { @@ -514,10 +513,15 @@ "metadata": {}, "outputs": [], "source": [ - "##If only using one image, please remove the extraneous image from this list \n", - "lhs_region = {\"x0\":50,\"x1\":250,\"y0\":100,\"y1\":900}\n", - "rhs_region = {\"x0\":700,\"x1\":900,\"y0\":100,\"y1\":900}\n", - "diff.plot_ramp_subplots(ima_files = [ima_scattered, ima_nominal], difference_method = 'cumulative', exclude_sources = False, ylims = [-0.3,0.3], lhs_region = lhs_region, rhs_region = rhs_region)" + "# If only using one image, please remove the extraneous image from this list \n", + "lhs_region = {\"x0\": 50, \"x1\": 250, \"y0\": 100, \"y1\": 900}\n", + "rhs_region = {\"x0\": 700, \"x1\": 900, \"y0\": 100, \"y1\": 900}\n", + "diff.plot_ramp_subplots(ima_files=[ima_scattered, ima_nominal], \n", + " difference_method='cumulative', \n", + " exclude_sources=False, \n", + " ylims=[-0.3, 0.3], \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)" ] }, { @@ -540,9 +544,12 @@ "outputs": [], "source": [ "try:\n", - " lhs_region = {\"x0\":50,\"x1\":250,\"y0\":100,\"y1\":900}\n", - " rhs_region = {\"x0\":700,\"x1\":900,\"y0\":100,\"y1\":900}\n", - " diff.plot_ima_difference_subplots(ima_filename = ima_scattered, difference_method = 'cumulative', lhs_region = lhs_region, rhs_region = rhs_region)\n", + " lhs_region = {\"x0\": 50, \"x1\": 250, \"y0\": 100, \"y1\": 900}\n", + " rhs_region = {\"x0\": 700, \"x1\": 900, \"y0\": 100, \"y1\": 900}\n", + " diff.plot_ima_difference_subplots(ima_filename=ima_scattered, \n", + " difference_method='cumulative', \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)\n", " \n", "except FileNotFoundError:\n", " print(\"No file by this name found\")" @@ -570,9 +577,12 @@ "outputs": [], "source": [ "try:\n", - " lhs_region = {\"x0\":50,\"x1\":250,\"y0\":100,\"y1\":900}\n", - " rhs_region = {\"x0\":700,\"x1\":900,\"y0\":100,\"y1\":900}\n", - " diff.plot_ima_difference_subplots(ima_filename = ima_nominal, difference_method = 'cumulative', lhs_region = lhs_region, rhs_region = rhs_region)\n", + " lhs_region = {\"x0\": 50, \"x1\": 250, \"y0\": 100, \"y1\": 900}\n", + " rhs_region = {\"x0\": 700, \"x1\": 900, \"y0\": 100, \"y1\": 900}\n", + " diff.plot_ima_difference_subplots(ima_filename=ima_nominal, \n", + " difference_method='cumulative', \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)\n", "\n", "except FileNotFoundError:\n", " print(\"No file by this name found\")" @@ -609,10 +619,15 @@ "metadata": {}, "outputs": [], "source": [ - "#If only using one image, please remove the extraneous image from this list \n", - "lhs_region = {\"x0\":50,\"x1\":250,\"y0\":100,\"y1\":900}\n", - "rhs_region = {\"x0\":700,\"x1\":900,\"y0\":100,\"y1\":900}\n", - "diff.plot_ramp_subplots(ima_files = [ima_scattered, ima_nominal], difference_method = 'instantaneous', exclude_sources = True, ylims = [0.5,2.5], lhs_region = lhs_region, rhs_region = rhs_region)" + "# If only using one image, please remove the extraneous image from this list \n", + "lhs_region = {\"x0\": 50, \"x1\": 250, \"y0\": 100, \"y1\": 900}\n", + "rhs_region = 
{\"x0\": 700, \"x1\": 900, \"y0\": 100, \"y1\": 900}\n", + "diff.plot_ramp_subplots(ima_files=[ima_scattered, ima_nominal], \n", + " difference_method='instantaneous', \n", + " exclude_sources=True, \n", + " ylims=[0.5, 2.5], \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)" ] }, { @@ -636,9 +651,12 @@ "outputs": [], "source": [ "try:\n", - " lhs_region = {\"x0\":50,\"x1\":250,\"y0\":100,\"y1\":900}\n", - " rhs_region = {\"x0\":700,\"x1\":900,\"y0\":100,\"y1\":900}\n", - " diff.plot_ima_difference_subplots(ima_filename = ima_scattered, difference_method = 'instantaneous', lhs_region = lhs_region, rhs_region = rhs_region)\n", + " lhs_region = {\"x0\": 50, \"x1\": 250, \"y0\": 100, \"y1\": 900}\n", + " rhs_region = {\"x0\": 700, \"x1\": 900, \"y0\": 100, \"y1\": 900}\n", + " diff.plot_ima_difference_subplots(ima_filename=ima_scattered, \n", + " difference_method='instantaneous', \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)\n", "\n", "except FileNotFoundError:\n", " print(\"No file by this name found\")" @@ -664,7 +682,10 @@ "outputs": [], "source": [ "try:\n", - " diff.plot_ima_difference_subplots(ima_filename = ima_nominal, difference_method = 'instantaneous', lhs_region = lhs_region, rhs_region = rhs_region)\n", + " diff.plot_ima_difference_subplots(ima_filename=ima_nominal, \n", + " difference_method='instantaneous', \n", + " lhs_region=lhs_region, \n", + " rhs_region=rhs_region)\n", " \n", "except FileNotFoundError:\n", " print(\"No file by this name found\")" @@ -770,9 +791,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.0" + "version": "3.11.6" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/notebooks/WFC3/ir_ima_visualization/ima_visualization_and_differencing.py b/notebooks/WFC3/ir_ima_visualization/ima_visualization_and_differencing.py index d174df612..d43e744b7 100644 --- a/notebooks/WFC3/ir_ima_visualization/ima_visualization_and_differencing.py +++ b/notebooks/WFC3/ir_ima_visualization/ima_visualization_and_differencing.py @@ -5,6 +5,7 @@ from ginga.util.zscale import zscale import matplotlib.patheffects as path_effects + def read_wfc3(filename): ''' Read a full-frame IR image and return the datacube plus integration times for each read. @@ -24,26 +25,23 @@ def read_wfc3(filename): integ_time : array-like Integration times associated with the datacube in ascending order. ''' - with fits.open(filename) as f: hdr = f[0].header NSAMP = hdr['NSAMP'] hdr1 = f[1].header - cube = np.zeros((hdr1['NAXIS1'],hdr1['NAXIS2'],NSAMP), dtype = float) - integ_time = np.zeros(shape = (NSAMP)) + cube = np.zeros((hdr1['NAXIS1'], hdr1['NAXIS2'], NSAMP), dtype=float) + integ_time = np.zeros(shape=NSAMP) for i in range(1, NSAMP+1): - cube[:,:,i-1] = f[('SCI', i)].data + cube[:, :, i-1] = f[('SCI', i)].data integ_time[i-1] = f[('TIME', i)].header['PIXVALUE'] - cube = cube[:,:,::-1] + cube = cube[:, :, ::-1] integ_time = integ_time[::-1] return cube, integ_time - def compute_diff_imas(cube, integ_time, diff_method): - ''' Compute the difference in signal between reads of a WFC3 IR IMA file. @@ -66,10 +64,9 @@ def compute_diff_imas(cube, integ_time, diff_method): 1024x1024x(NSAMP-1) datacube of the differebce between IR IMA reads in ascending time order, where NSAMP is the number of samples taken. 
''' - if diff_method == 'instantaneous': ima_j = cube[:, :, 1:] - ima_j_1 = cube[:,:,0:-1] + ima_j_1 = cube[:, :, 0:-1] t_0 = integ_time[0] t_j = integ_time[1:] t_j_1 = integ_time[0:-1] @@ -77,7 +74,7 @@ def compute_diff_imas(cube, integ_time, diff_method): diff = ((ima_j*(t_j-t_0))-(ima_j_1*(t_j_1-t_0)))/(t_j-t_j_1) elif diff_method == 'cumulative': - diff = cube[:,:,0:-1] - cube[:,:,1:] + diff = cube[:, :, 0:-1] - cube[:, :, 1:] else: # if an incorrect method is chosen raise an error raise ValueError(f"{diff_method} is an invalid method. The allowed methods are 'instantaneous' and 'cumulative'.") @@ -86,7 +83,6 @@ def compute_diff_imas(cube, integ_time, diff_method): def get_median_fullframe_lhs_rhs(cube, lhs_region, rhs_region): - ''' Compute the median in the full-frame image, the user-defined left side region, and the user-defined right side region. @@ -113,19 +109,19 @@ def get_median_fullframe_lhs_rhs(cube, lhs_region, rhs_region): median_rhs : array of floats The median signal of the right side of each read. ''' - - - median_full_frame = np.nanmedian(cube[5:-5,5:-5,:], axis = (0,1)) + median_full_frame = np.nanmedian(cube[5:-5, 5:-5, :], + axis=(0, 1)) median_lhs = np.nanmedian(cube[lhs_region['y0']:lhs_region['y1'], - lhs_region['x0']:lhs_region['x1'],:], axis = (0,1)) + lhs_region['x0']:lhs_region['x1'], :], + axis=(0, 1)) median_rhs = np.nanmedian(cube[rhs_region['y0']:rhs_region['y1'], - rhs_region['x0']:rhs_region['x1'],:], axis = (0,1)) - + rhs_region['x0']:rhs_region['x1'], :], + axis=(0, 1)) return median_full_frame, median_lhs, median_rhs + def get_std_fullframe_lhs_rhs(cube, lhs_region, rhs_region): - ''' Compute the standard deviation of the signal in the full-frame image, the user-defined left side region, and the user-defined right side region. @@ -154,20 +150,19 @@ def get_std_fullframe_lhs_rhs(cube, lhs_region, rhs_region): standard_dev_rhs : array of floats The standard deviation of the signal of the right side of each read. ''' - - - standard_dev_fullframe = np.nanstd(cube[5:-5,5:-5,:], axis = (0,1)) + standard_dev_fullframe = np.nanstd(cube[5:-5, 5:-5, :], + axis=(0, 1)) standard_dev_lhs = np.nanstd(cube[lhs_region['y0']:lhs_region['y1'], - lhs_region['x0']:lhs_region['x1'],:], axis = (0,1)) + lhs_region['x0']:lhs_region['x1'], :], + axis=(0, 1)) standard_dev_rhs = np.nanstd(cube[rhs_region['y0']:rhs_region['y1'], - rhs_region['x0']:rhs_region['x1'],:], axis = (0,1)) - - + rhs_region['x0']:rhs_region['x1'], :], + axis=(0, 1)) + return standard_dev_fullframe, standard_dev_lhs, standard_dev_rhs - + def plot_ramp(ima, integ_time, median_diff_fullframe, median_diff_lhs, median_diff_rhs): - ''' Plots the signal accumulation ramp of an IMA image. Each point is the median signal (in e-/s) of the difference between subsequent reads. The median signal difference is plotted for the full @@ -190,19 +185,22 @@ def plot_ramp(ima, integ_time, median_diff_fullframe, median_diff_lhs, median_di median_diff_rhs: array-like The median difference in signal between the right side of each read. 
''' - - plt.plot(integ_time[2:], median_diff_fullframe[1:], 's', markersize = 25, label = 'Full Frame', color = 'black') - plt.plot(integ_time[2:], median_diff_lhs[1:], '<', markersize = 20, label = 'LHS', color = 'orange') - plt.plot(integ_time[2:], median_diff_rhs[1:], '>', markersize = 20, label = 'RHS', color = 'green') + plt.plot(integ_time[2:], median_diff_fullframe[1:], 's', markersize=25, + label='Full Frame', color='black') + plt.plot(integ_time[2:], median_diff_lhs[1:], '<', markersize=20, + label='LHS', color='orange') + plt.plot(integ_time[2:], median_diff_rhs[1:], '>', markersize=20, + label='RHS', color='green') ax = plt.gca() - for spine in ['top', 'bottom', 'left', 'right']: ax.spines[spine].set_visible(False) + for spine in ['top', 'bottom', 'left', 'right']: + ax.spines[spine].set_visible(False) plt.grid() plt.xlabel('SAMPTIME [s]') - plt.ylabel('$\mu$ [e-/s]') - plt.legend(loc = 0) + plt.ylabel(r'$\mu$ [e-/s]') + plt.legend(loc=0) plt.title(ima) - - + + def panel_plot(cube, integ_time, median_diff_full_frame, median_diff_lhs, median_diff_rhs, standard_dev_fullframe, standard_dev_lhs, standard_dev_rhs, diff_method): ''' @@ -245,7 +243,7 @@ def panel_plot(cube, integ_time, median_diff_full_frame, median_diff_lhs, median fig: figure object Panel plot with subplots showing the difference between subsequent IMA reads. - Above each panel, we print the median difference $\mu$ in the count rate over the entire image. + Above each panel, we print the median difference mu in the count rate over the entire image. Below each panel, we list the IMSET difference, along with the time interval between the two IMSETs. The statistics in orange (on the left and right side of each panel) give the median rate and standard deviation of each side of the image, respectively. The value in green 'delta' is the @@ -253,54 +251,52 @@ def panel_plot(cube, integ_time, median_diff_full_frame, median_diff_lhs, median The value in white "Ratio" gives the ratio of the median difference in orange for the left versus the right side. 
'''
-
-
-    xlabel_list = ["SCI[16-15]","SCI[15-14]","SCI[14-13]","SCI[13-12]","SCI[12-11]",
-                  "SCI[11-10]","SCI[10-9]","SCI[9-8]","SCI[8-7]","SCI[[7-6]]","SCI[6-5]",
-                  "SCI[5-4]","SCI[4-3]","SCI[3-2]","SCI[2-1]"]
+    xlabel_list = ["SCI[16-15]", "SCI[15-14]", "SCI[14-13]", "SCI[13-12]", "SCI[12-11]",
+                   "SCI[11-10]", "SCI[10-9]", "SCI[9-8]", "SCI[8-7]", "SCI[7-6]", "SCI[6-5]",
+                   "SCI[5-4]", "SCI[4-3]", "SCI[3-2]", "SCI[2-1]"]
 
     fig, axarr = plt.subplots(4, 4)
     fig.set_size_inches(40, 40)
     fig.set_dpi(40)
     itime = integ_time[0:-1] - integ_time[1:]
 
-    diff = compute_diff_imas(cube, integ_time, diff_method = diff_method)
-
-
+    diff = compute_diff_imas(cube, integ_time, diff_method=diff_method)
+
     for i, ax in enumerate(axarr.reshape(-1)):
-        if (i < cube.shape[-1]-2):
-            i=i+1
+        if i < cube.shape[-1]-2:
+            i += 1
 
-            diff_i = diff[:,:,i]
-            vmin,vmax = zscale(diff_i)
+            diff_i = diff[:, :, i]
+            vmin, vmax = zscale(diff_i)
 
             im = ax.imshow(np.abs(diff_i), cmap='Greys_r', origin='lower',
-                          vmin = vmin, vmax = vmax)
-            ax.set_title(f'$\mu = ${median_diff_full_frame[i]:.2f}±{standard_dev_fullframe[i]:.2f} e-/s', fontsize = 30)
-
-            text = ax.text(50, 500, f'{median_diff_lhs[i]:.3f}\n±\n{standard_dev_lhs[i]:.3f}', color='Orange', fontsize=30)
+                           vmin=vmin, vmax=vmax)
+            title = fr'$\mu = ${median_diff_full_frame[i]:.2f}±{standard_dev_fullframe[i]:.2f} e-/s'
+            ax.set_title(title, fontsize=30)
+            text_lhs = f'{median_diff_lhs[i]:.3f}\n±\n{standard_dev_lhs[i]:.3f}'
+            text = ax.text(50, 500, text_lhs, color='Orange', fontsize=30)
             text.set_path_effects([path_effects.Stroke(linewidth=15, foreground='black'),
-                                  path_effects.Normal()])
-            text = ax.text(700, 500, f'{median_diff_rhs[i]:.3f}\n±\n{standard_dev_rhs[i]:.3f}', color='Orange', fontsize=30)
+                                   path_effects.Normal()])
+            text_rhs = f'{median_diff_rhs[i]:.3f}\n±\n{standard_dev_rhs[i]:.3f}'
+            text = ax.text(700, 500, text_rhs, color='Orange', fontsize=30)
             text.set_path_effects([path_effects.Stroke(linewidth=15, foreground='black'),
-                                  path_effects.Normal()])
-            text = ax.text(200, 900, f'Ratio = {median_diff_lhs[i]/median_diff_rhs[i]:.2f}', color='White', fontsize=30)
+                                   path_effects.Normal()])
+            text_ratio = f'Ratio = {median_diff_lhs[i]/median_diff_rhs[i]:.2f}'
+            text = ax.text(200, 900, text_ratio, color='White', fontsize=30)
             text.set_path_effects([path_effects.Stroke(linewidth=15, foreground='black'),
-                                  path_effects.Normal()])
-            text = ax.text(300, 300, f'$\Delta = ${median_diff_lhs[i]-median_diff_rhs[i]:.2f}', color='#32CD32', fontsize=30)
+                                   path_effects.Normal()])
+            text_delta = fr'$\Delta = ${median_diff_lhs[i]-median_diff_rhs[i]:.2f}'
+            text = ax.text(300, 300, text_delta, color='#32CD32', fontsize=30)
             text.set_path_effects([path_effects.Stroke(linewidth=15, foreground='black'),
-                                  path_effects.Normal()])
+                                   path_effects.Normal()])
 
-            cbar = plt.colorbar(im, ax = ax)
-            cbar.ax.tick_params(labelsize = 20)
-
+            cbar = plt.colorbar(im, ax=ax)
+            cbar.ax.tick_params(labelsize=20)
 
             ax.set_yticklabels([])
             ax.set_xticklabels([])
-            ax.set_xlabel(f'{xlabel_list[i]}, $\Delta t = ${np.abs(itime[i]):.2f} sec', fontsize = 30)
-
+            ax.set_xlabel(fr'{xlabel_list[i]}, $\Delta t = ${np.abs(itime[i]):.2f} sec', fontsize=30)
         else:
-
             ax.set_axis_off()
 
     return fig
@@ -321,7 +317,6 @@ def plot_ima_subplots(ima_filename, vmin, vmax):
     vmax: float
         Maximum magnitude for scaling the data range that the colormap covers. 
'''
-
     path, filename = os.path.split(ima_filename)
 
     cube, integ_time = read_wfc3(ima_filename)
@@ -329,23 +324,22 @@
     fig_panel1, axarr = plt.subplots(4, 4)
     fig_panel1.set_size_inches(40, 40)
     fig_panel1.set_dpi(40)
-    plt.rcParams.update({'font.size':40})
-    itime = integ_time[0:-1] - integ_time[1:]
-    read_title=np.arange(16,0,-1)
+    plt.rcParams.update({'font.size': 40})
+    read_title = np.arange(16, 0, -1)
 
     for i, ax in enumerate(axarr.reshape(-1)):
 
-        im = ax.imshow(cube[:,:,i], cmap = 'Greys_r', origin = 'lower', vmin = vmin , vmax = vmax)
+        im = ax.imshow(cube[:, :, i], cmap='Greys_r', origin='lower', vmin=vmin, vmax=vmax)
 
-        cbar=plt.colorbar(im, ax = ax)
-        cbar.ax.tick_params(labelsize = 20)
-        ax.set_title(f'SCI, {read_title[i]}', fontsize = 40)
+        cbar = plt.colorbar(im, ax=ax)
+        cbar.ax.tick_params(labelsize=20)
+        ax.set_title(f'SCI, {read_title[i]}', fontsize=40)
         ax.set_yticklabels([])
        ax.set_xticklabels([])
 
-    _=fig_panel1.suptitle(filename, fontsize = 40)
-    plt.subplots_adjust(bottom = 0.3, right = 0.9, top = 0.95)
-
-
+    _ = fig_panel1.suptitle(filename, fontsize=40)
+    plt.subplots_adjust(bottom=0.3, right=0.9, top=0.95)
+
+
 def plot_ramp_subplots(ima_files, difference_method, ylims, exclude_sources, lhs_region, rhs_region):
     '''
     Build a simple figure with subplots of IMA accumulation ramps.
@@ -372,40 +366,42 @@
     rhs_region : dict
         The four corners (x0, x1, y0, y1) of the right hand region.
     '''
-
-    fig = plt.figure(figsize = (50, 20))
+    fig = plt.figure(figsize=(50, 20))
     fig
     rows = 1
     columns = 2
     subplot_titles = ['scattered', 'nominal']
 
-    for i,ima in enumerate(ima_files):
-
+    for i, ima in enumerate(ima_files):
         path, filename = os.path.split(ima)
 
         cube, integ_time = read_wfc3(ima)
 
-        if exclude_sources == True:
+        if exclude_sources is True:
             cube[np.abs(cube) > 3] = np.nan
 
-        diff_cube = compute_diff_imas(cube, integ_time, diff_method = difference_method)
-        median_diff_fullframe, median_diff_lhs, median_diff_rhs = get_median_fullframe_lhs_rhs(diff_cube, lhs_region = lhs_region, rhs_region = rhs_region)
+        diff_cube = compute_diff_imas(cube, integ_time, diff_method=difference_method)
+
+        median_diff_fullframe, median_diff_lhs, median_diff_rhs = (
+            get_median_fullframe_lhs_rhs(diff_cube,
+                                         lhs_region=lhs_region,
+                                         rhs_region=rhs_region))
 
         ax = fig.add_subplot(rows, columns, i+1)
 
         plot_ramp(ima, integ_time, median_diff_fullframe, median_diff_lhs, median_diff_rhs)
 
-        ax.set_ylim(ylims[0],ylims[1])
+        ax.set_ylim(ylims[0], ylims[1])
 
-        ax.tick_params(axis = "x", labelsize = 30)
-        ax.tick_params(axis = "y", labelsize = 30)
+        ax.tick_params(axis="x", labelsize=30)
+        ax.tick_params(axis="y", labelsize=30)
 
-        _=ax.set_title(f'{filename}, {subplot_titles[i]}', fontsize=50)
+        _ = ax.set_title(f'{filename}, {subplot_titles[i]}', fontsize=50)
 
 
 def plot_ima_difference_subplots(ima_filename, difference_method, lhs_region, rhs_region):
     '''
     Build a complex panel plot of the difference between individual IMA reads.
-    The median difference $\mu$ in the count rate over the entire image is printed above each panel. Below each panel,
+    The median difference mu in the count rate over the entire image is printed above each panel. Below each panel,
     the IMSET difference, along with the time interval between the two IMSETs, is printed.
     The statistics in orange (on the left and right side of each panel) give the median rate and standard
     deviation of each side of the image, respectively. 
The value in green 'delta' is the @@ -429,18 +425,30 @@ def plot_ima_difference_subplots(ima_filename, difference_method, lhs_region, rh ''' - path,filename = os.path.split(ima_filename) + path, filename = os.path.split(ima_filename) cube, integ_time = read_wfc3(ima_filename) - median_fullframe, median_lhs, median_rhs = get_median_fullframe_lhs_rhs(cube, lhs_region = lhs_region, rhs_region = rhs_region) - - diff_cube = compute_diff_imas(cube, integ_time, diff_method = difference_method) + median_fullframe, median_lhs, median_rhs = ( + get_median_fullframe_lhs_rhs(cube, + lhs_region=lhs_region, + rhs_region=rhs_region)) - median_diff_fullframe, median_diff_lhs, median_diff_rhs = get_median_fullframe_lhs_rhs(diff_cube, lhs_region = lhs_region, rhs_region = rhs_region) - standard_dev_fullframe, standard_dev_lhs, standard_dev_rhs = get_std_fullframe_lhs_rhs(diff_cube, lhs_region = lhs_region, rhs_region = rhs_region) + diff_cube = compute_diff_imas(cube, integ_time, diff_method=difference_method) - fig_0 = panel_plot(cube, integ_time, median_diff_fullframe, median_diff_lhs, median_diff_rhs, standard_dev_fullframe, standard_dev_lhs, standard_dev_rhs, diff_method = difference_method) - _=fig_0.suptitle(filename, fontsize = 40) - plt.subplots_adjust(bottom = 0.25, right = 0.9, top = 0.95) + median_diff_fullframe, median_diff_lhs, median_diff_rhs = ( + get_median_fullframe_lhs_rhs(diff_cube, + lhs_region=lhs_region, + rhs_region=rhs_region)) + + standard_dev_fullframe, standard_dev_lhs, standard_dev_rhs = ( + get_std_fullframe_lhs_rhs(diff_cube, + lhs_region=lhs_region, + rhs_region=rhs_region)) + fig_0 = panel_plot(cube, integ_time, median_diff_fullframe, median_diff_lhs, + median_diff_rhs, standard_dev_fullframe, standard_dev_lhs, + standard_dev_rhs, diff_method=difference_method) + + _ = fig_0.suptitle(filename, fontsize=40) + plt.subplots_adjust(bottom=0.25, right=0.9, top=0.95) diff --git a/notebooks/WFC3/ir_ima_visualization/requirements.txt b/notebooks/WFC3/ir_ima_visualization/requirements.txt index 6de0ed093..ee67db939 100644 --- a/notebooks/WFC3/ir_ima_visualization/requirements.txt +++ b/notebooks/WFC3/ir_ima_visualization/requirements.txt @@ -1,5 +1,5 @@ astropy==5.2.1 astroquery==0.4.6 -ginga==4.1.1 +ginga==4.0.1 matplotlib==3.7.0 numpy==1.23.4 From 8303468fe49cfa993f435b32a4a9b4a302a12fbe Mon Sep 17 00:00:00 2001 From: dulude Date: Mon, 11 Dec 2023 13:42:04 -0500 Subject: [PATCH 28/30] Setup.ipynb: Fixed "back to top of page" link --- notebooks/COS/Setup/Setup.ipynb | 7 +++---- 1 file changed, 3 insertions(+), 4 deletions(-) diff --git a/notebooks/COS/Setup/Setup.ipynb b/notebooks/COS/Setup/Setup.ipynb index f03afc4c1..761b5be3c 100644 --- a/notebooks/COS/Setup/Setup.ipynb +++ b/notebooks/COS/Setup/Setup.ipynb @@ -411,9 +411,8 @@ "\n", "> *This tutorial was generated to be in compliance with the [STScI style guides](https://github.com/spacetelescope/style-guides) and would like to cite the [Jupyter guide](https://github.com/spacetelescope/style-guides/blob/master/templates/example_notebook.ipynb) in particular.*\n", "\n", - "[Top of Page](#topS)\n", - "\"Space \n", - "\n" + "Top of Page\n", + "\"Space " ] } ], @@ -433,7 +432,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.9" + "version": "3.8.12" } }, "nbformat": 4, From 41567069d0d5ac85dad19b205fe1e7cdefb10f1c Mon Sep 17 00:00:00 2001 From: dulude Date: Tue, 12 Dec 2023 10:11:30 -0500 Subject: [PATCH 29/30] calwf3_with_v1.0_PCTE.ipynb: un-corrupted file. 
--- notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb index 07cfc9d4d..26ea2efa3 100644 --- a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb +++ b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb @@ -772,6 +772,7 @@ "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", "\"An" + ] }, { "cell_type": "markdown", @@ -988,7 +989,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.0" + "version": "3.8.12" }, "varInspector": { "cols": { From b871343aeb1d47d123bfc4f17439bc1c6759e4b4 Mon Sep 17 00:00:00 2001 From: dulude Date: Tue, 12 Dec 2023 10:35:33 -0500 Subject: [PATCH 30/30] calwf3_with_v1.0_PCTE.ipynb: fixed scaling of second and third images --- notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb index 26ea2efa3..30d7f0033 100644 --- a/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb +++ b/notebooks/WFC3/calwf3_v1.0_cte/calwf3_with_v1.0_PCTE.ipynb @@ -771,7 +771,8 @@ "metadata": {}, "source": [ "
Animated GIF of the v1.0 and v2.0 FLC image subsections:
\n", - "\"An" + "\n", + "\"An" ] }, { @@ -787,7 +788,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\"Aperture" + "\"Aperture" ] }, {