From a5eaa48efc0e012c4a69491a4cf20e1ac58f162a Mon Sep 17 00:00:00 2001 From: Mitchell Revalski <56605543+mrevalski@users.noreply.github.com> Date: Thu, 13 Jun 2024 14:31:01 -0400 Subject: [PATCH 01/14] Revised DrizzlePac Notebook: using_updated_astrometry_solutions (#275) * Revised using_updated_astrometry_solutions.ipynb This notebook has been revised in consultation with J. Mack and the DrizzlePac team. The updates include an improved workflow for WCS solutions that resolves previously encountered errors in the notebook, as well as numerous STScI DMD formatting and PEP8 compliance changes. The notebook was tested extensively with stenv2024.02.05. * Revised requirements.txt The revised requirements.txt is identical to the previous file in terms of required packages, with updated version numbers that were used in testing with stenv2024.02.05. * Cleared cell outputs Cleared the output messages from all cells. * Updated documentation The documentation for several code cells and the notebook's overall workflow was improved by Jennifer Mack. 
* Apply context manager for file handling --------- Co-authored-by: Hatice Karatay --- .../requirements.txt | 13 +- .../using_updated_astrometry_solutions.ipynb | 665 ++++++++++++------ 2 files changed, 458 insertions(+), 220 deletions(-) diff --git a/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt b/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt index ba5142693..b89c9626e 100644 --- a/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt +++ b/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt @@ -1,7 +1,8 @@ -astropy==5.3.3 +astropy==6.0.0 astroquery==0.4.6 -drizzlepac==3.5.1 -matplotlib==3.7.0 -numpy==1.23.4 -stwcs==1.7.2 -crds +crds==11.17.15 +drizzlepac==3.6.2 +ipython==8.21.0 +matplotlib==3.8.2 +numpy==1.26.3 +jupyter==1.0.0 \ No newline at end of file diff --git a/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb b/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb index bd0f87234..5e2d53a9a 100644 --- a/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb +++ b/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb @@ -4,56 +4,79 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Using updated astrometry solutions" + "\n", + "# Improving Astrometry Using Alternate WCS Solutions" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "As part of an effort to improve the absolute astrometry of HST images, STScI has created multiple new astrometric solutions for ACS and WFC3 images. These solutions are contained in the World Coordinate System (WCS) of the images, as well as headerlet extesions. This notebook provides an example workflow showing how to use/change to a different solution." 
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The solutions are contained in the exposure files (`flt.fits` and `flc.fits`), so no extra files are required. However, these updates were implemented in December 2019 as a part of HST Data Processing version 2019.5. If data were downloaded before that, they can either be redownloaded (and will contain the new solutions) from mast.stsci.edu or via astroquery, or can be updated by connecting to the database cotaining solutions (shown below).\n", + "
This notebook requires creating and activating a virtual environment using the requirements file in this notebook's repository. Please also review the README file before using the notebook.
\n", "\n", - "For more information, see the overview page here: https://outerspace.stsci.edu/pages/viewpage.action?spaceKey=HAdP&title=Improvements+in+HST+Astrometry\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "
\n", - "REQUIREMENT: This notebook was designed around using stwcs version 1.5.3, and python version 3.6. If versions older than this are used the software will likely not work!\n", - "
" + "## Table of Contents\n", + "\n", + "\n", + "[Introduction](#intro)
\n", + "[Import Packages](#import)
\n", + "\n", + " 0. [Example Data Download](#0.-Example-Data-Download)\n", + " 1. [New Extensions on FITS Files](#1.-New-extensions-on-fits-files)\n", + " 2. [Exploring different solutions](#2.-Exploring-different-solutions)\n", + " 3. [Applying a headerlet to the science extensions](#3.-Applying-a-headerlet-to-the-science-extensions)\n", + " 4. [Changing to alternate WCS solutions](#4.-Changing-to-alternate-WCS-solutions)
\n", + "       4.1 [FIT-REL Gaia eDR3 solution](#4.1-FIT-REL-Gaia-eDR3-solution)
\n", + "       4.2 [\"a priori\" solution](#4.2-\"a-priori\"-solution)
\n", + "       4.3 [\"distortion-only\" solution](#4.3-\"distortion-only\"-solution)
\n", + "       4.4 [FIT-SVM Gaia DR2 solution](#4.4-FIT-SVM-Gaia-DR2-solution)
\n", + " 5. [Using downloaded SVM headerlets](#5.-Using-downloaded-SVM-headerlets)\n", + " 6. [Running AstroDrizzle](#6.-Running-AstroDrizzle)\n", + "\n", + "[Conclusions](#conclude)
\n", + "[About this Notebook](#about)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ + "## Introduction\n", + "\n", + "\n", + "Starting in December 2019, improved astrometric solutions for ACS and WFC3 images are available in the World Coordinate System (WCS) of the exposure file (`flt.fits` and `flc.fits`) FITS headers, with alternate WCS solutions appended as additional headerlet extensions. These solutions are also available as separate [headerlet](https://stwcs.readthedocs.io/en/latest/headerlet.html) FITS files which may be downloaded and applied to the FITS images.
\n", + "
\n", + "This notebook shows how to examine different WCS solutions contained in the FITS images and how to improve the relative alignment of exposures in the F225W and F336W filters which were taken in the same visit but which have different active WCS solutions.\n", + "\n", + "During the calibration portion of the pipeline processing, the drizzlepac software calls the [updatewcs](https://stwcs.readthedocs.io/en/latest/astrometry_utils.html#usage) module to populate the WCS headerlet extensions in the FITS images and then sets the 'bestSolutionID' as the active WCS. The astrometry database captures every unique WCS solution for a given dataset cataloged by 'ipppssoot' and includes WCS's derived from standard (HST) and Hubble Advanced Products (HAP). This gives us a complete history of the active WCS over time and these solutions can change as the alignment software is improved, as new distortion reference files are delivered, and/or as new reference catalogs become available. (A re-alignment is performed ONLY when there is a new distortion solution or absolute reference catalog.)\n", + "\n", "
\n", - "NOTE: The new solutions are used by default by the MAST pipeline! Some datasets may have certain solutions such as fitting to Gaia DR2, though other datasets (even in the same visit) may not! Thus, it is crucial to check the which solution (WCS) is active! The easiest way to check which solution is active is to check the WCSNAME keyword in the header of the SCI extensions.\n", - "
" + "NOTE: While some datasets may have WCS solutions which are aligned to an external reference catalog, such as Gaia eDR3, GSC v2.4.2 or 2MASS, other datasets (even in the same visit) may not! Thus, it is crucial to check which WCS solution is active for all of the exposures. The easiest way to do this is to examine the WCSNAME keyword in the header of the SCI extensions.
\n", + "
\n", + "Alternatively, a more accurate WCS solution may be available in the HAP Single Visit Mosaic (SVM) products created by MAST. In this workflow, the pipeline first aligns all of the images in a given visit and second aligns the entire group to an external reference catalog. Here, we show how to download the SVM headerlets and apply them to the FITS data to improve the relative aligment prior to running AstroDrizzle.\n", + "\n", + "\n", + "For more information alternate WCS solutions, see [Section 4.5](https://hst-docs.stsci.edu/drizzpac/chapter-4-astrometric-information-in-the-header/4-5-absolute-astrometry) of the Drizzlepac Handbook. For more details on the Hubble Advanced Products and Single Visit Mosaics (SVMs), see the following [MAST Newsletter Article](https://archive.stsci.edu/contents/newsletters/december-2020/hap-single-visit-mosaics-now-available)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Table Of Contents\n", + "\n", + "## Import Packages\n", + "[Table of Contents](#toc)\n", "\n", - " 0. [Example Data Download](#0.-Example-Data-Download)\n", - " 1. [New Extensions on FITS Files](#1.-New-extensions-on-fits-files)\n", - " 2. [Exploring different solutions](2.-Exploring-different-solutions)\n", - " 3. [Applying a headerlet to the science extensions](#3.-Applying-a-headerlet-to-the-science-extensions)\n", - " 4. [Restoring the \"old\" solution](#4.-Restoring-the-\"old\"-solution)\n", - " 5. [Inspecting alignment](#5.-Inspecting-alignment-(optional))\n", - " 6. 
[Changing the WCS of drizzled images](#6.-Changing-the-WCS-of-drizzled-images)" + "***\n", + "\n", + "The following Python packages are required to run the Jupyter Notebook:\n", + " - [**os**](https://docs.python.org/3/library/os.html) - change and make directories\n", + " - [**glob**](https://docs.python.org/3/library/glob.html) - gather lists of filenames\n", + " - [**shutil**](https://docs.python.org/3/library/shutil.html#module-shutil) - remove directories and files\n", + " - [**numpy**](https://numpy.org) - math and array functions\n", + " - [**matplotlib**](https://matplotlib.org/stable/tutorials/pyplot.html) - make figures and graphics\n", + " - [**astropy**](https://www.astropy.org) - file handling, tables, units, WCS, statistics\n", + " - [**astroquery**](https://astroquery.readthedocs.io/en/latest/) - download data and query databases\n", + " - [**drizzlepac**](https://www.stsci.edu/scientific-community/software/drizzlepac) - align and combine HST images" ] }, { @@ -62,28 +85,31 @@ "metadata": {}, "outputs": [], "source": [ - "import numpy as np\n", "import os\n", + "import glob\n", "import shutil\n", + "import numpy as np\n", "import matplotlib.pyplot as plt\n", - "\n", + "from IPython.display import clear_output\n", "from astropy.io import fits\n", + "from astropy.table import Table\n", "from astropy.wcs import WCS\n", - "from astropy.visualization import PercentileInterval, ImageNormalize, LogStretch\n", + "from astropy.visualization import ZScaleInterval\n", "from astroquery.mast import Observations\n", "from stwcs.wcsutil import headerlet\n", - "from stwcs import updatewcs\n", - "from drizzlepac.align import generate_astrometric_catalog\n", "from drizzlepac import astrodrizzle\n", - "\n", - "%matplotlib notebook" + "from drizzlepac.processInput import getMdriztabPars\n", + "from collections import defaultdict\n", + "%matplotlib notebook\n", + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina' # Greatly improves the resolution 
of figures rendered in notebooks." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Because some steps in this notebook require access to reference files, we will create a temporary 'iref' directory here to place these reference files after download. This step is typically done by defining the 'iref' path in your bash profile so that all reference files for all datasets can be in one static location, but for the portability of this notebook we will just create a temporary directory. Please see the ['Initalization' notebook](../Initialization/Initialization.ipynb) for more information." + "Some steps in this notebook require access to HST reference files, so we will create a temporary 'iref' directory for these reference files after download. This step is typically done by defining the 'iref' path in your bash profile so that all reference files for all datasets can be in one static location, but for the portability of this notebook we will create a directory. " ] }, { @@ -93,23 +119,34 @@ "outputs": [], "source": [ "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", - "os.environ['CRDS_PATH'] = os.path.abspath(os.path.join('.', 'reference_files'))\n", - "\n", - "os.environ['iref'] = os.path.abspath(os.path.join('.', 'reference_files', 'references', 'hst', 'wfc3')) + os.path.sep" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 0. Example Data Download" + "os.environ['CRDS_SERVER'] = 'https://hst-crds.stsci.edu'\n", + "os.environ['CRDS_PATH'] = './crds_cache'\n", + "os.environ['iref'] = './crds_cache/references/hst/wfc3/'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Lets find some example HST data from MAST and download it:" + "## 0. Example Data Download\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "---\n", + "MAST queries may be done using `query_criteria`, where we specify:
\n", + "\n", + "    $\\rightarrow$ obs_id, proposal_id, and filters \n", + "\n", + "MAST data products may be downloaded by using `download_products`, where we specify:
\n", + "\n", + "    $\\rightarrow$ products = calibrated (FLT, FLC) or drizzled (DRZ, DRC) files\n", + "\n", + "    $\\rightarrow$ type = standard products (CALxxx) or advanced products (HAP-SVM)\n", + "____\n", + "\n", + "Let's find some example HST data from MAST and download it. The example used here is from visit 14 of program [16801](http://www.stsci.edu/cgi-bin/get-proposal-info?id=16801&observatory=HST). The associations IEPW14030 and IEPW14040 each contain two FLC images in the F336W and F225W filters and a single DRC combined image for each filter. Here we download the FLC, DRC, and ASN files using `astroquery`.\n", + "\n", + "
Depending on your connection speed this cell may take a few minutes to execute.
" ] }, { @@ -118,27 +155,22 @@ "metadata": {}, "outputs": [], "source": [ - "obsTable = Observations.query_criteria(project='HST', proposal_id='14689', obs_id='ID7307030')\n", + "obs_ids = ['IEPW14030', 'IEPW14040']\n", + "\n", + "obsTable = Observations.query_criteria(obs_id=obs_ids)\n", "products = Observations.get_product_list(obsTable)\n", - "filtered_products = Observations.filter_products(products, mrp_only=False, productSubGroupDescription=['FLC', 'ASN'])\n", - "filtered_products" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "dl_tbl = Observations.download_products(filtered_products, mrp_only=False)\n", - "dl_tbl" + "\n", + "data_prod = ['FLC', 'ASN', 'DRC'] # ['FLC', 'FLT', 'DRC', 'DRZ']\n", + "data_type = ['CALWF3'] # ['CALACS', 'CALWF3', 'CALWP2', 'HAP-SVM']\n", + "\n", + "Observations.download_products(products, productSubGroupDescription=data_prod, project=data_type, cache=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Now to make the paths easier to work with, lets move those files from their default download locations into the same location as this notebook:" + "Next, we retrieve the Hubble Advanced Product (HAP) headerlets, which we will use to change between different WCS solutions." 
] }, { @@ -147,17 +179,14 @@ "metadata": {}, "outputs": [], "source": [ - "for f in dl_tbl['Local Path']:\n", - " filename = os.path.split(f)[-1]\n", - " if not os.path.exists(filename):\n", - " shutil.move(f, '.')" + "Observations.download_products(products, productSubGroupDescription=['HLET'], project=['HAP-SVM'], cache=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "And then download all of the necessary reference files for the images using CRDS (this uses the CRDS paths defined at the top of the notebook, although you would probably want to use a different path for your own data as described above):" + "Now to make the paths easier to work with, we move those files from their default download location into the notebook directory. In addition, we add one to the headerlet extension numbers because lists are zero indexed while the EXTVER's extensions are unity based. We do this by defining a small `correct_hdrlet_extvers()` function." ] }, { @@ -166,23 +195,30 @@ "metadata": {}, "outputs": [], "source": [ - "for row in filtered_products:\n", - " if row['productSubGroupDescription'] == 'FLC':\n", - " os.system('crds bestrefs --files {}_flc.fits --sync-references=1 --update-bestrefs'.format(row['obs_id']))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 1. New extensions on fits files" + "for fits_file in glob.glob('./mastDownload/HST/*/*.fits'):\n", + " fits_name = os.path.basename(fits_file)\n", + " os.rename(fits_file, fits_name)\n", + " \n", + "if os.path.exists('mastDownload'):\n", + " shutil.rmtree('mastDownload')" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "Using `fits.info` prints basic information about the extensions in a fits file. In the following examples, we will show operations for one file, though performing the same operations for multiple files simply requires looping." 
+ "def correct_hdrlet_extvers(filename):\n", + " \"\"\"Correctly renumbers hdrlet EXTVER values\"\"\"\n", + " with fits.open(filename, mode='update') as hdulist:\n", + " hdrlet_count = 0\n", + " for i, ext in enumerate(hdulist):\n", + " if ext.name == 'HDRLET':\n", + " hdrlet_count += 1\n", + " hdulist[i].header['EXTVER'] = hdrlet_count\n", + " else:\n", + " continue" ] }, { @@ -191,14 +227,15 @@ "metadata": {}, "outputs": [], "source": [ - "filename = 'id7307xfq_flc.fits'" + "for flc_file in sorted(glob.glob('*flc.fits')):\n", + " correct_hdrlet_extvers(flc_file)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "If no extensions named `HDRLET` appear, then solutions can be downloaded from the database (if available) and automatically appended to the fits files using the following command:" + "Now we can check the active WCS solution in the image header. If the image is aligned to a catalog, we list the number of matches and the fit RMS converted from milliarcseconds to pixels." 
] }, { @@ -207,44 +244,92 @@ "metadata": {}, "outputs": [], "source": [ - "updatewcs.updatewcs(filename, use_db=True)" + "ext_0_keywords = ['DETECTOR', 'EXPTIME', 'FILTER'] # extension 0 keywords.\n", + "ext_1_keywords = ['WCSNAME', 'NMATCHES', 'RMS_RA', 'RMS_DEC'] # extension 1 keywords.\n", + "\n", + "# Define the detector plate scales in arcsec per pixel.\n", + "DETECTOR_SCALES = {\n", + " 'IR': 0.1283, \n", + " 'UVIS': 0.0396, \n", + " 'WFC': 0.05\n", + "}\n", + "\n", + "formatted_data = {}\n", + "column_data = defaultdict(list)\n", + "\n", + "for fits_file in sorted(glob.glob('*fl?.fits')):\n", + " column_data['filename'].append(fits_file)\n", + " header0 = fits.getheader(fits_file, 0)\n", + " header1 = fits.getheader(fits_file, 1)\n", + " \n", + " for keyword in ext_0_keywords:\n", + " column_data[keyword].append(header0[keyword])\n", + " for keyword in ext_1_keywords:\n", + " if keyword in header1:\n", + " if 'RMS' in keyword:\n", + " value = np.around(header1[keyword], decimals=1)\n", + " else:\n", + " value = header1[keyword]\n", + " column_data[keyword].append(value)\n", + " else:\n", + " column_data[keyword].append(np.nan)\n", + " \n", + " for keyword in ['RMS_RA', 'RMS_DEC']:\n", + " if keyword in header1:\n", + " rms_value = header1[keyword] / 1000 / DETECTOR_SCALES[header0['DETECTOR']]\n", + " column_data[f'{keyword}_pix'].append(np.round(rms_value, decimals=2))\n", + " else:\n", + " column_data[f'{keyword}_pix'].append(np.nan)\n", + "\n", + "wcstable = Table(column_data)\n", + "wcstable" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The updated solutions should then show up as extra `HDRLET` extensions" + "Here we see that the first two exposures (F336W) have a fit to 'Gaia eDR3' with 46 matches and a fit RMS of ~0.3 pixels. The next two exposures (F225W) do not have a catalog fit and use the 'a priori' correction to the Guide Star Catalog v2.4. 
In section 5, we show how to apply the SVM headerlet to align the F225W filter to Gaia eDR3." ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "fits.info(filename)" + "## 1. New extensions on FITS files\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "Using `fits.info` prints basic information about the extensions in a FITS file. In the following examples, we show operations for one F336W `flc.fits` file, though the same operations can be repeated in a loop for multiple files. The updated solutions should then show up as extra `HDRLET` extensions." ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "As seen above, there are new `HDRLET` extensions in the fits files (compared to the pre-2019.3 products). These extensions each contain information used to construct a World Coordinate System (WCS), which is used to transform image coordinates into physical (sky) coordinates. Each of these WCS's represent an astrometric solution derived in a different way." + "filename = 'iepw14g4q_flc.fits'\n", + "\n", + "fits.info(filename)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## 2. Exploring different solutions" + "As seen above, there are new `HDRLET` extensions in the FITS files (as compared to the pre-2019.3 products). These extensions each contain information used to construct a World Coordinate System (WCS), which is used to transform image coordinates into physical (sky) coordinates. Each WCS represents a uniquely derived astrometric solution." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Each HDRLET extension contains information describing the solution used in its creation. To investigate this we first programmatically obtain the extension numbers of the HDRLETs." + "## 2. 
Exploring different solutions\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "Each HDRLET extension contains information describing the solution used in its creation. To investigate this we first obtain the extension numbers of the HDRLETs." ] }, { @@ -255,15 +340,14 @@ "source": [ "ext_indices = headerlet.find_headerlet_HDUs(filename, strict=False)\n", "\n", - "# To show it's consistent with the fits.info from above\n", - "print(ext_indices)" + "print(ext_indices) # To show it's consistent with the fits.info from above." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can then loop through these extensions to see what the WCS solutions are." + "We can then loop through these extensions to see what WCS solutions are available." ] }, { @@ -272,19 +356,18 @@ "metadata": {}, "outputs": [], "source": [ - "hdu = fits.open(filename)\n", - "print('Ext\\tWCSNAME')\n", - "for ext_ind in ext_indices:\n", - " print(ext_ind, '\\t', hdu[ext_ind].header['WCSNAME'])\n", + "with fits.open(filename) as hdu:\n", + " print('Ext\\tWCSNAME')\n", "\n", - "hdu.close()" + " for ext_ind in ext_indices:\n", + " print(ext_ind, '\\t', hdu[ext_ind].header['WCSNAME'])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Alternatively, we can use `get_headerlet_kw_names`:" + "Alternatively, we can use the `get_headerlet_kw_names()` function:" ] }, { @@ -311,18 +394,19 @@ "outputs": [], "source": [ "def get_hdrlet_wcsnames(filename):\n", + "\n", " \"\"\"Print and return list of WCS names in HDRLET extensions of fits file\"\"\"\n", - " hdu = fits.open(filename)\n", - " ext_indices = headerlet.find_headerlet_HDUs(filename, strict=False)\n", "\n", - " print('Ext\\tWCSNAME')\n", - " new_wcsnames = []\n", - " for ext_ind in ext_indices:\n", - " name = hdu[ext_ind].header['WCSNAME']\n", - " print(ext_ind, '\\t', name)\n", - " new_wcsnames.append(name)\n", - " \n", - " hdu.close()\n", + " with fits.open(filename) as hdu:\n", + " ext_indices = 
headerlet.find_headerlet_HDUs(filename, strict=False)\n", + "\n", + " print('Ext\\tWCSNAME')\n", + " new_wcsnames = []\n", + " for ext_ind in ext_indices:\n", + " name = hdu[ext_ind].header['WCSNAME']\n", + " print(ext_ind, '\\t', name)\n", + " new_wcsnames.append(name)\n", + "\n", " return new_wcsnames" ] }, @@ -357,32 +441,22 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The nature of each solution is described here: https://drizzlepac.readthedocs.io/en/latest/astrometry.html#interpreting-wcs-names" + "The nature of each solution is described here: https://drizzlepac.readthedocs.io/en/latest/mast_data_products/astrometry.html#interpreting-wcs-names. In some cases, single-visit mosaic (SVM) solution named FIT-SVM-GAIADR2 might be better than the default active solution of FIT-REL-GAIAeDR3." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## 3. Applying a headerlet to the science extensions" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "To apply/activate one of the other solutions, the `restore_from_headerlet()` function can be used. This function applies the WCS contained in a HDRLET extension to all SCI extensions of the image. Doing this requires knowing which solution should be applied, which can be obtained in multiple ways.\n", + "## 3. Applying a headerlet to the science extensions\n", "\n", - "
\n", + "[Table of Contents](#toc)\n", + "\n", + "To apply/activate one of the other solutions, we use the `restore_from_headerlet()` function. This applies the WCS contained in a HDRLET extension to all SCI extensions of the image. Doing this requires knowing which solution should be applied, which can be obtained in multiple ways. For instance, if the desired solution is `IDC_2731450pi-FIT_REL_GAIAeDR3`, we can find the `EXTVER` of the corresponding HDRLET from the list of wcs names we generated earlier.\n", + "\n", + "
\n", "NOTE: This is especially useful in cases where some of the exposures in a visit will have solutions that are aligned to Gaia, but others won't. This is true for grism images in the same visit as direct images, or shallow/deep exposure combinations.\n", - "
" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "For instance, if the desired solution is `IDC_2731450pi-GSC240`, we can find the `EXTVER` of the corresponding HDRLET from the list of wcs names we generated before." + "
" ] }, { @@ -391,10 +465,11 @@ "metadata": {}, "outputs": [], "source": [ - "# Gets the index of list element with value 'IDC_2731450pi-GSC240'\n", - "# The index in this list + 1 is the same as the EXTVER of the corresponding HDRLET\n", - "# We need to add 1 because lists are 0-indexed, while EXTVER's are 1 indexed\n", - "chosen_ext = new_wcsnames.index('IDC_2731450pi-GSC240') + 1" + "# Gets the index of list element with value 'IDC_2731450pi-GSC240'.\n", + "# The index in this list + 1 is the same as the EXTVER of the corresponding HDRLET.\n", + "# We need to add 1 because lists are 0-indexed, while EXTVER's are 1 indexed.\n", + "\n", + "chosen_ext = new_wcsnames.index('IDC_2731450pi-FIT_REL_GAIAeDR3')+1" ] }, { @@ -411,13 +486,7 @@ "metadata": {}, "source": [ "In this case we set `archive` keyword argument to `False`. Setting `archive` to True will preserve the currently active WCS as a new HDRLET extension on the file. Since in our case the current solution already has a HDRLET, we do not need to archive it. 
This may be useful in some cases, such as when the image has been manually aligned/transformed, and keeping a record of that solution is desired.\n", - "\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ + "\n", "We can check that the solution was applied:" ] }, @@ -428,7 +497,6 @@ "outputs": [], "source": [ "current_wcs = fits.getval(filename, 'WCSNAME', ext=('SCI', 1))\n", - "\n", "print(current_wcs)" ] }, @@ -446,15 +514,7 @@ "outputs": [], "source": [ "hdrlet_hdrnames = headerlet.get_headerlet_kw_names(fits.open(filename), 'HDRNAME')\n", - "desired_hdrname = hdrlet_hdrnames[new_wcsnames.index('IDC_2731450pi-GSC240')]" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ + "desired_hdrname = hdrlet_hdrnames[new_wcsnames.index('IDC_2731450pi-GSC240')]\n", "print(desired_hdrname)" ] }, @@ -471,7 +531,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We can also apply some logic to get the `hdrext` programatically. For instance, if we only wanted the `IDC` (distortion calibrated) solution with the `GSC240` tag (indicating that the guide star positions had been updated), we can do the following:" + "We can also apply some logic to get the `hdrext` programatically. For instance, if we only wanted the `IDC` (distortion calibrated) solution with the `GSC240` tag (indicating that this is a 'a priori' WCS where the guide star positions had been updated), we can do the following:" ] }, { @@ -499,34 +559,46 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## 4. Restoring the \"old\" solution" + "## 4. 
Changing to alternate WCS solutions\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "Here we look at three WCS solutions and inspect which has the best alignment with respect to stars in the HST image.\n", + "\n", + "### 4.1 FIT-REL Gaia eDR3 solution\n", + "\n", + "When the WCS is `FIT_REL_eDR3`, the individual exposures are aligned to one another and then the entire association is aligned to Gaia eDR3. " ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "If the original solution is desired, it can be restored using the methods shown above, by replacing the WCSNAME simply with `IDC_2731450pi`, or whatever the name of the IDCTAB for that file is. To get that name, the following procedure can be done:\n" + "chosen_ext = new_wcsnames.index('IDC_2731450pi-FIT_REL_GAIAeDR3') + 1\n", + "headerlet.restore_from_headerlet(filename, hdrext=('HDRLET', chosen_ext), archive=False, force=False)\n", + "current_wcs = fits.getval(filename, 'WCSNAME', ext=('SCI', 1))\n", + "print(current_wcs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "
\n", - "NOTE: Data taken after October 2017 may not have a solution of the form IDC_xxxxxxxxx, as the pointing information of the telescope was already calculated using GSC 2.4.0. As such, the \"old\" solution may be of the same form as the a priori solution, i.e.: IDC_xxxxxxxxx-GSC240.\n", - "
" + "### 4.2 \"a priori\" solution" ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "idc_file = fits.getval(filename, 'IDCTAB')\n", - "idc_filename = idc_file.split('$')[-1]\n", - "idc_wcsname = 'IDC_' + idc_filename.replace('_idc.fits', '')" + "When the WCS does not have the string `FIT`, but is appended with either `GSC240 or HSC30`, this is known as an a priori solution which simply corrects the coordinates of the guide stars in use at the time of observation to the coordinates of those stars as determined by Gaia, applying a global offset to the WCS.\n", + "\n", + "\n", + "
\n", + "NOTE: Data taken after October 2017 may not have an a priori solution, as the pointing information of the telescope was already calculated using GSC 2.4.0. As such, the \"old\" solution may be of the same form as the a priori solution, i.e.: IDC_xxxxxxxxx-GSC240.\n", + "
" ] }, { @@ -535,14 +607,19 @@ "metadata": {}, "outputs": [], "source": [ - "print(idc_wcsname)" + "chosen_ext = new_wcsnames.index('IDC_2731450pi-GSC240') + 1\n", + "headerlet.restore_from_headerlet(filename, hdrext=('HDRLET', chosen_ext), archive=False, force=False)\n", + "current_wcs = fits.getval(filename, 'WCSNAME', ext=('SCI', 1))\n", + "print(current_wcs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Restoring the solution is then done the same as above:" + "### 4.3 \"distortion only\" solution\n", + "\n", + "If the original solution is desired, with no updates to the WCS and the original HST pointing information, it can be restored using the methods shown below, by replacing the WCSNAME simply with `IDC_2731450pi`, or whatever is the name of the IDCTAB reference file. " ] }, { @@ -551,16 +628,22 @@ "metadata": {}, "outputs": [], "source": [ - "chosen_ext = new_wcsnames.index(idc_wcsname) + 1" + "chosen_ext = new_wcsnames.index('IDC_2731450pi') + 1\n", + "headerlet.restore_from_headerlet(filename, hdrext=('HDRLET', chosen_ext), archive=False, force=False)\n", + "current_wcs = fits.getval(filename, 'WCSNAME', ext=('SCI', 1))\n", + "print(current_wcs)" ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "headerlet.restore_from_headerlet(filename, hdrext=('HDRLET', chosen_ext), archive=False, force=False)" + "## 5. Using downloaded SVM headerlets\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "In cases like the example provided here, images from the same visit may have different WCS solution types (i.e. F336W is `FIT-REL-GAIAeDR3` while the F225W is `GSC240`).
\n", + "
However, we can apply the SVM headerlet solutions, which are derived by first aligning the HST images relative to each other, and then aligning the group to an absolute reference catalog. Thus, they are often a better solution for datasets with a variety of filters/depths." ] }, { @@ -569,16 +652,14 @@ "metadata": {}, "outputs": [], "source": [ - "current_wcs = fits.getval(filename, 'WCSNAME', ext=('SCI', 1))\n", - "\n", - "print(current_wcs)" + "hlet_files = sorted(glob.glob('*hlet.fits'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Alternatively, `updatewcs` can be run on the image, with the `use_db` flag set to `False`:" + "Let's look at the WCS solution in the headerlet for each image:" ] }, { @@ -587,7 +668,18 @@ "metadata": {}, "outputs": [], "source": [ - "updatewcs.updatewcs(filename, use_db=False)" + "root_to_hlet_dict = {}\n", + "for hlet in hlet_files:\n", + " dest_image = fits.getval(hlet, 'DESTIM')\n", + " root_to_hlet_dict[dest_image] = hlet\n", + " print(hlet, dest_image, fits.getval(hlet, 'WCSNAME', 1))" ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we simply have to match each SVM headerlet to its corresponding FLC file and apply it." ] }, { @@ -596,30 +688,46 @@ "metadata": {}, "outputs": [], "source": [ - "current_wcs = fits.getval(filename, 'WCSNAME', ext=('SCI', 1))\n", - "\n", - "print(current_wcs)" + "for flc in sorted(glob.glob('*flc.fits')):\n", + " root = fits.getval(flc, 'rootname')\n", + " headerlet.apply_headerlet_as_primary(flc, hdrlet=root_to_hlet_dict[root], attach=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## 5. Inspecting alignment (optional)" + "## 6. Running `AstroDrizzle`\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "Because the drizzling process is directly affected by the WCSs of the input FITS images, the WCS of the drizzled image cannot be changed as simply as shown above for FLC images. 
To use an astrometric solution (other than the one applied to the FLT/FLC at the time of drizzling), the images will have to be re-drizzled after activating the desired WCS. \n", + "\n", + "Here we query the association ID of each of the input files and add it to a dictionary." ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "While the solutions should improve the astrometry of the images, they may not have perfect absolute astrometry. To check the quality of the absolute astrometry. To do this, we can overplot the positions of the Gaia sources on the images. If the absolute astrometry is correct, the Gaia positions will project on top of the sources in the images. The WCS solutions that contain 'FIT-GAIA' in the WCSNAME should have the closest alignment, while the GSC-240 and baseline IDC solution will likely have small offsets." + "asn_dict = defaultdict(list)\n", + "\n", + "for flc in sorted(glob.glob('*flc.fits')):\n", + " asn_id = fits.getval(flc, 'asn_id')\n", + " if asn_id == 'NONE':\n", + " asn_id = fits.getval(flc, 'rootname')\n", + " asn_id = asn_id.lower()\n", + " asn_dict[asn_id].append(flc)\n", + "asn_dict" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We first need to get a listing of the Gaia sources:" + "Next, we prepare to drizzle the data, and we get some recommended values for drizzling from the MDRIZTAB reference file. The parameters in this file are different for each detector and are based on the number of input frames and the filter. These are a good starting point for drizzling and may be adjusted accordingly." 
] }, { @@ -628,15 +736,35 @@ "metadata": {}, "outputs": [], "source": [ - "ast_tbl = generate_astrometric_catalog('id7307xfq_flc.fits', output=True) \n", - "ast_tbl" + "input_images_f336w = sorted(glob.glob('iepw14g[46]q_flc.fits'))\n", + "mdz = fits.getval(input_images_f336w[0], 'MDRIZTAB', ext=0).split('$')[1]\n", + "print('Searching for the MDRIZTAB file:', mdz)\n", + "get_mdriztab = os.system('crds sync --hst --files ' + mdz + ' --output-dir ' + os.environ['iref'])" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "We then need the WCS of the images, to project the Gaia positions onto the image coordinate frame." + "def get_vals_from_mdriztab(input_images, kw_list=['driz_sep_bits', \n", + " 'combine_type', \n", + " 'driz_cr_snr', \n", + " 'driz_cr_scale', \n", + " 'final_bits']):\n", + " \n", + " '''Get only selected parameters from the MDRIZTAB.'''\n", + " mdriz_dict = getMdriztabPars(input_images)\n", + " \n", + " requested_params = {}\n", + " \n", + " print('Outputting the following parameters:')\n", + " for k in kw_list:\n", + " requested_params[k] = mdriz_dict[k]\n", + " print(k, mdriz_dict[k])\n", + " \n", + " return requested_params" ] }, { @@ -645,68 +773,110 @@ "metadata": {}, "outputs": [], "source": [ - "hdu = fits.open(filename)\n", - "w = WCS(hdu['SCI', 1].header, hdu)\n", - "data = hdu['SCI', 1].data" + "selected_params = get_vals_from_mdriztab(asn_dict['iepw14030'])" ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "x, y = w.all_world2pix(np.array([ast_tbl['RA'], ast_tbl['DEC']]).T, 0).T" + "Here we see the recommended parameters for 2 input FLC frames. These can be modified by uncommenting the lines below, as needed for optimal cosmic-ray rejection. 
For details, see the notebook [Aligning Multiple Visits](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/align_multiple_visits/align_multiple_visits.html) in this notebook repository. \n", + "\n", + "In the cell below, we run `AstroDrizzle` once for each filter using the association ID dictionary. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { - "scrolled": false + "scrolled": true }, "outputs": [], "source": [ - "fig = plt.figure(figsize=(10, 5))\n", - "ax = plt.subplot(projection=w) # matplotlib has support for axes using WCS projections!\n", + "for asn_id in asn_dict:\n", + " \n", + " input_images = asn_dict[asn_id]\n", + " \n", + " # To override any of the above values:\n", + " # selected_params['driz_sep_bits'] = '256, 64, 16'\n", + " # selected_params['final_bits'] = '256, 64, 16'\n", + " # selected_params['combine_type'] = 'median'\n", + " # selected_params['driz_cr_snr'] = '4.0 3.5'\n", + " # selected_params['driz_cr_scale'] = '1.2 1.0'\n", "\n", - "ax.imshow(data, origin='lower',\n", - " norm=ImageNormalize(data, interval=PercentileInterval(99.65), stretch=LogStretch()))\n", - "ax.coords.grid(True, color='white', ls=':', alpha=.6)\n", - "ax.autoscale(False)\n", - "ax.scatter(x, y, edgecolors='r', facecolor=None, alpha=.8)" + " selected_params = get_vals_from_mdriztab(input_images)\n", + " \n", + " astrodrizzle.AstroDrizzle(input_images, \n", + " output=f'{asn_id}_updated_wcs',\n", + " preserve=False,\n", + " clean=True, \n", + " build=True,\n", + " context=False,\n", + " skymethod='match',\n", + " in_memory=True,\n", + " **selected_params)\n", + "clear_output()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The original (only IDC tab) WCS will likely have an offset between the Gaia positions and the sources. The GSC-240 solution offset is generally smaller, while any of the `GAIA-FIT` solutions should be well under a pixel. 
Using the code in section 3, the offsets can be seen for each available WCS." + "Next, we will display the default pipeline drizzled (DRC) image retrieved from MAST to show the astrometric offset for a zoomed in region of the image. We define the center and scaling to be the same for both sets." ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "## 6. Changing the WCS of drizzled images" + "center = [10.0095776, 40.5014080]\n", + "z = ZScaleInterval()" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "Because the drizzling process is directly affected by the WCS's of the input images, the WCS of the drizzled image cannot be changed as simply as shown above for FLC images. To use an astrometric solution (other than the one applied to the FLT/FLC at the time of drizzling), the images will have to be re-drizzled after activating the desired WCS in the FLT/FLC images.\n", + "# Display the two science images and zoom in on an object to see the astrometric error.\n", + "image1 = 'iepw14030_drc.fits'\n", + "image2 = 'iepw14040_drc.fits'\n", "\n", - "
\n", - "NOTE: The images input to astrodrizzle should use the same WCS solution, or the drizzling process will produce poor results.\n", - "
" + "sci_image1 = fits.getdata(image1)\n", + "sci_image2 = fits.getdata(image2)\n", + "wcs_image1 = WCS(fits.getheader(image1, 1))\n", + "wcs_image2 = WCS(fits.getheader(image2, 1))\n", + "x1, y1 = wcs_image1.world_to_pixel_values([center])[0].astype(int)\n", + "x2, y2 = wcs_image2.world_to_pixel_values([center])[0].astype(int)\n", + "\n", + "fig = plt.figure(figsize=(10, 10))\n", + "ax1 = fig.add_subplot(1, 2, 1, projection=wcs_image1)\n", + "ax2 = fig.add_subplot(1, 2, 2, projection=wcs_image2)\n", + "\n", + "ax1.set_title('WCS: '+fits.getval(image1, 'WCSNAME', ext=('SCI', 1)))\n", + "ax2.set_title('WCS: '+fits.getval(image2, 'WCSNAME', ext=('SCI', 1)))\n", + "ax1.imshow(sci_image1, vmin=z.get_limits(sci_image1)[0], vmax=z.get_limits(sci_image1)[1]*5, cmap='Greys_r', origin='lower', interpolation='none')\n", + "ax2.imshow(sci_image2, vmin=z.get_limits(sci_image2)[0], vmax=z.get_limits(sci_image2)[1]*5, cmap='Greys_r', origin='lower', interpolation='none')\n", + "\n", + "ax1.set_xlim(x1-50, x1+50)\n", + "ax1.set_ylim(y1-50, y1+50)\n", + "ax2.set_xlim(x2-50, x2+50)\n", + "ax2.set_ylim(y2-50, y2+50)\n", + "ax1.grid(lw=1, color='white', ls=':')\n", + "ax2.grid(lw=1, color='white', ls=':')\n", + "plt.tight_layout()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "In order to get the same name for the drizzled file, we can simply run astrodrizzle on the association (asn) file we downloaded before." + "Here we see a small misalignment between the two filters in the pipeline drizzled files which have different WCS solutions.
\n", + "
\n", + "Finally, we display the redrizzled image which uses the improved FIT-SVM-GAIAeDR3 WCS, which restores the alignment." ] }, { @@ -715,16 +885,83 @@ "metadata": {}, "outputs": [], "source": [ - "astrodrizzle.AstroDrizzle('id7307030_asn.fits', mdriztab=True, build=True, clean=True, preserve=False)" + "# Display the two science images and zoom in on an object to see the astrometric error.\n", + "image1 = 'iepw14030_updated_wcs_drc.fits'\n", + "image2 = 'iepw14040_updated_wcs_drc.fits'\n", + "\n", + "sci_image1 = fits.getdata(image1)\n", + "sci_image2 = fits.getdata(image2)\n", + "wcs_image1 = WCS(fits.getheader(image1, 1))\n", + "wcs_image2 = WCS(fits.getheader(image2, 1))\n", + "x1, y1 = wcs_image1.world_to_pixel_values([center])[0].astype(int)\n", + "x2, y2 = wcs_image2.world_to_pixel_values([center])[0].astype(int)\n", + "\n", + "fig = plt.figure(figsize=(10, 10))\n", + "ax1 = fig.add_subplot(1, 2, 1, projection=wcs_image1)\n", + "ax2 = fig.add_subplot(1, 2, 2, projection=wcs_image2)\n", + "\n", + "ax1.set_title('WCS: '+fits.getval(image1, 'WCSNAME', ext=('SCI', 1)))\n", + "ax2.set_title('WCS: '+fits.getval(image2, 'WCSNAME', ext=('SCI', 1)))\n", + "ax1.imshow(sci_image1, vmin=z.get_limits(sci_image1)[0], vmax=z.get_limits(sci_image1)[1]*5, cmap='Greys_r', origin='lower', interpolation='none')\n", + "ax2.imshow(sci_image2, vmin=z.get_limits(sci_image2)[0], vmax=z.get_limits(sci_image2)[1]*5, cmap='Greys_r', origin='lower', interpolation='none')\n", + "\n", + "ax1.set_xlim(x1-50, x1+50)\n", + "ax1.set_ylim(y1-50, y1+50)\n", + "ax2.set_xlim(x2-50, x2+50)\n", + "ax2.set_ylim(y2-50, y2+50)\n", + "ax1.grid(lw=1, color='white', ls=':')\n", + "ax2.grid(lw=1, color='white', ls=':')\n", + "plt.tight_layout()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# About this Notebook\n", + "\n", + "## Conclusions\n", + "\n", + "[Table of Contents](#toc)\n", + "\n", + "This notebook demonstrates how to access and apply different WCS solutions from exposure 
and SVM headerlets. In general, it is always preferred to have consistent WCS solutions across exposures, especially from the same visit. Users can also custom align their exposures to one another, as well as to external catalogs such as SDSS and Gaia. This process is detailed in the [align_to_catalogs](https://github.com/spacetelescope/hst_notebooks/blob/main/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb) notebook." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## About this Notebook\n", + " \n", + " Created: 14 Dec 2018; V. Bajaj\n", + " Updated: 31 May 2024; M. Revalski, V. Bajaj, & J. Mack\n", + "\n", + "**Source:** GitHub [spacetelescope/hst_notebooks](https://github.com/spacetelescope/hst_notebooks)\n", + "\n", + "\n", + "## Additional Resources\n", + "\n", + "Below are some additional resources that may be helpful. Please send any questions through the [HST Help Desk](https://stsci.service-now.com/hst), selecting the DrizzlePac category.\n", + "\n", + "- [WFC3 Website](https://www.stsci.edu/hst/instrumentation/wfc3)\n", + "- [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb)\n", + "- [WFC3 Instrument Handbook](https://hst-docs.stsci.edu/wfc3ihb)\n", + "\n", + "\n", + "## Citations\n", + "If you use Python packages such as `astropy`, `astroquery`, `drizzlepac`, `matplotlib`, or `numpy` for published research, please cite the authors.\n", + "\n", + "Follow these links for more information about citing various packages:\n", + "\n", + "* [Citing `astropy`](https://www.astropy.org/acknowledging.html)\n", + "* [Citing `astroquery`](https://github.com/astropy/astroquery/blob/main/astroquery/CITATION)\n", + "* [Citing `drizzlepac`](https://zenodo.org/records/3743274)\n", + "* [Citing `matplotlib`](https://matplotlib.org/stable/users/project/citing.html)\n", + "* [Citing `numpy`](https://numpy.org/citing-numpy/)\n", + "***\n", "\n", - " Updated: June 12, 2023 by A. 
O'Connor -- STScI WFC3 " + "[Top of Page](#top)\n", + "\"Space " ] } ], @@ -744,9 +981,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.10" + "version": "3.11.7" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } From d0e880ca887f6df4d31646c57c0e22cbc3ab0ac1 Mon Sep 17 00:00:00 2001 From: Mitchell Revalski <56605543+mrevalski@users.noreply.github.com> Date: Fri, 14 Jun 2024 08:37:00 -0400 Subject: [PATCH 02/14] Updated drizzlepac readme (#283) * updated readme file * updated toc ordering for drizzlepac --- _toc.yml | 13 ++++--- notebooks/DrizzlePac/README.md | 64 +++++++++++++++++++++++++--------- 2 files changed, 53 insertions(+), 24 deletions(-) diff --git a/_toc.yml b/_toc.yml index 6e88afb04..ee0ad98ae 100644 --- a/_toc.yml +++ b/_toc.yml @@ -31,17 +31,16 @@ parts: chapters: - file: notebooks/DrizzlePac/README.md #- file: notebooks/DrizzlePac/Initialization/Initialization.ipynb + - file: notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb - file: notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb -# Notebook excluded from build until build failures are fixed (See SPB-1168) - - file: notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb - - file: notebooks/DrizzlePac/align_multiple_visits/align_multiple_visits.ipynb - file: notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.ipynb - #- file: notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb - - file: notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb + - file: notebooks/DrizzlePac/align_multiple_visits/align_multiple_visits.ipynb + - file: notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb - file: notebooks/DrizzlePac/optimize_image_sampling/optimize_image_sampling.ipynb + - file: notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb - file: notebooks/DrizzlePac/sky_matching/sky_matching.ipynb - - file: 
notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb - - file: notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb + - file: notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb + #- file: notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb - caption: HASP chapters: - file: notebooks/HASP/Setup/Setup.ipynb diff --git a/notebooks/DrizzlePac/README.md b/notebooks/DrizzlePac/README.md index 129d77448..55b521855 100644 --- a/notebooks/DrizzlePac/README.md +++ b/notebooks/DrizzlePac/README.md @@ -1,25 +1,55 @@ -# DrizzlePac Jupyter Notebook Tutorials +DrizzlePac Notebooks +================= +An updated set of HST drizzling and alignment tutorials is now available and compatible with the latest STScI distributed software environment [stenv](https://stenv.readthedocs.io/en/latest/). These notebooks include a new recommended workflow for MAST data retrieved after December 2019, which include updated astrometric information as additional FITS extensions. Alternatively, the new World Coordinate System (WCS) solutions may be downloaded directly from MAST as small 'headerlet' files and applied to existing data. For example, the Hubble Advanced Product 'Single Visit Mosaics' may have improved relative alignment for different filters acquired in the same visit. These headerlets may be used to update the WCS in the FITS images prior to drizzling. For details on the alignment of HST data in MAST, see Section 4.5 [Absolute Astrometry](https://hst-docs.stsci.edu/drizzpac/chapter-4-astrometric-information-in-the-header/4-5-absolute-astrometry) in the DrizzlePac Handbook. -Improved drizzling tutorials are now available as Jupyter Notebooks and compatible with the latest STScI distributed software as part of AstroConda. 
Prior drizzling examples were written for the DrizzlePac Handbook in 2012, just after MultiDrizzle was replaced, and supplemental examples were posted to the DrizzlePac Webpage in 2015 to support enhanced features in DrizzlePac 2.0. The new interactive notebooks consolidate information from these prior examples to form a more cohesive set, and any references to outdated software, such as PyRAF, have been removed and replaced with python functionality. +In each notebook, a sample WFC3 or ACS dataset is used to demonstrate how to download the calibrated data, inspect the quality of the alignment, and test whether the observations need to be realigned before combining the data with `AstroDrizzle`. Different workflows are illustrated to enhance the scientific value of the drizzled data products using advanced reprocessing techniques. These notebooks highlight different use cases, e.g. images acquired using small sub-pixel dithers to optimally sample the PSF versus those acquired in multiple pointings to generate large mosaics on the sky. -The notebooks contain live code and visualizations, along with the usual narrative text, making them an ideal training exercise for new users. Each tutorial includes blocks of code demonstrating how to download the calibrated data from the MAST archive, how to align frames and update the image world coordinate system, and how to enhance the scientific value of the drizzled data products using advanced reprocessing techniques. 
+The notebooks available in this repository include: -The eleven notebooks available in this repository include the following topics: +Alignment Workflows: +- [Improving Absolute and Relative Astrometry Using Alternate WCS Solutions](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.html) +- [Aligning HST images to an Absolute Reference Catalog](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.html) +- [Aligning Deep Exposures of Sparse Fields](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.html) +- [Aligning Multiple HST Visits](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/align_multiple_visits/align_multiple_visits.html) +- [Using DS9 Regions for Source Inclusion/Exclusion](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.html) -* Initializing DrizzlePac -* Aligning observations obtained in multiple HST visits -* Aligning HST images to an absolute reference catalog (e.g. 
GAIA, SDSS) -* Aligning sparse fields -* Improving alignment with DS9 exclusion regions -* Masking satellite trails in DQ arrays prior to drizzling -* Optimizing the image sampling for dithered datasets -* Drizzling WFPC2 data to use a single zeropoint -* Sky matching features for HST mosaics -* Aligning HST mosaics observed with multiple detectors -* Using the updated astrometry solutions based on _Gaia_ positions +Drizzling Features: +- [Optimizing the Image Sampling for Sub-pixel dithers](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/optimize_image_sampling/optimize_image_sampling.html) +- [Creating HST mosaics observed with multiple detectors](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/align_mosaics/align_mosaics.html) +- [Using Sky Matching features for HST mosaics](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/sky_matching/sky_matching.html) +- [Masking satellite trails prior to drizzling](https://spacetelescope.github.io/hst_notebooks/notebooks/DrizzlePac/mask_satellite/mask_satellite.html) -Additional tutorials will be added to the repository as new software functionality becomes available, especially for advanced use cases. For additional assistance with DrizzlePac tools, users may submit a ticket to the [STScI Help Desk](https://stsci.service-now.com/hst?id=hst_index) or send an email to help@stsci.edu. +For more information, see the [DrizzlePac Handbook](https://hst-docs.stsci.edu/drizzpac) and the [readthedocs](https://drizzlepac.readthedocs.io/en/latest/) software documentation. For additional assistance with DrizzlePac tools, users may submit a ticket to the [STScI Help Desk](https://stsci.service-now.com/hst?id=hst_index) and should select the DrizzlePac category. -Special thanks to the authors of these notebooks: J. Mack, S. Hoffmann, R. Avila, V. Bajaj, M. Cara, T. Desjardins, K. Huynh, B. Kuhn, C. Martlin, A. O’Connor, and C. 
Shanahan +Installation +------------ + +It is recommended to clone the entire repository. To do so, run the following command in a terminal: + +``` +git clone https://github.com/spacetelescope/hst_notebooks +``` + +`stenv` is the preferred base virtual environment for running these notebooks since +it contains the libraries necessary for processing and analyzing data from the Hubble +Space Telescope (HST) and the James Webb Space Telescope (JWST). To install, see +[stenv readthedocs](https://stenv.readthedocs.io/en/latest/) or +[stenv GitHub](https://github.com/spacetelescope/stenv). + +`hst_notebooks/notebooks_env` is the default virtual environment for HST Notebooks, +which contains the same scientific computing libraries as `stenv`, but not the HST and +JWST libraries. This environment can also be used as a base, but this is not recommended. + +In addition, each notebook contains a `requirements.txt` file that needs to be +installed before running the notebooks. Here is a common set of commands to run +before executing the notebooks (assuming your virtual environment is activated): + +``` +pip install -r requirements.txt +pip install notebook +``` + +With the environment activated and additional libraries installed based on the +individual requirement files, you will be able to complete the notebooks. 
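[Editor's sketch] The per-notebook `requirements.txt` workflow described in the README above can be sanity-checked before launching a notebook. The snippet below is an illustrative sketch, not part of the repository: the `parse_requirements` and `check_installed` helper names are our own, and only the Python standard library is used.

```python
from importlib import metadata


def parse_requirements(text):
    """Parse 'name==version' pins from requirements.txt-style text."""
    pins = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and comments
        name, _, version = line.partition('==')
        pins.append((name.strip(), version.strip() or None))
    return pins


def check_installed(pins):
    """Report whether each pinned package is installed at the pinned version."""
    report = {}
    for name, pinned in pins:
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            report[name] = 'missing'
        else:
            report[name] = 'ok' if pinned in (None, installed) else f'have {installed}, want {pinned}'
    return report


# Pins taken from the updated requirements.txt in this PR.
pins = parse_requirements("astropy==6.0.0\ndrizzlepac==3.6.2\njupyter==1.0.0")
print(check_installed(pins))
```

The report will vary with your environment, so treat mismatches as a prompt to `pip install -r requirements.txt` rather than an error.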
From 3d4a1f2d34fda422ff938f7da0e6b1b95ab9806e Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Mon, 24 Jun 2024 09:52:15 -0400 Subject: [PATCH 03/14] Update ci_buildondemand.yml --- .github/workflows/ci_buildondemand.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/ci_buildondemand.yml b/.github/workflows/ci_buildondemand.yml index c10e4c79b..73df221ea 100644 --- a/.github/workflows/ci_buildondemand.yml +++ b/.github/workflows/ci_buildondemand.yml @@ -4,6 +4,6 @@ on: jobs: ExecuteNotebooks: - uses: spacetelescope/notebook-ci-actions/.github/workflows/ci_scheduled.yml@v4 + uses: spacetelescope/notebook-ci-actions/.github/workflows/ci_scheduled.yml@multi_dev with: python-version: ${{ vars.PYTHON_VERSION }} From 5b9105b1c49f50abb401ce5d862ee238b472131c Mon Sep 17 00:00:00 2001 From: Hatice Karatay <66814693+haticekaratay@users.noreply.github.com> Date: Thu, 27 Jun 2024 10:30:52 -0400 Subject: [PATCH 04/14] Test the rebuild of HTML when no nb present (#289) We experienced issues in the CI when a PR was created with no associated notebooks. As a result, the README failed to reflect the updates from the latest merge due to an execution failure. --- notebooks/DrizzlePac/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/notebooks/DrizzlePac/README.md b/notebooks/DrizzlePac/README.md index 55b521855..99fa42750 100644 --- a/notebooks/DrizzlePac/README.md +++ b/notebooks/DrizzlePac/README.md @@ -52,4 +52,4 @@ pip install notebook ``` With the environment activated and additional libraries installed based on the -individual requirement files, you will be able to complete the notebooks. +individual requirement files, you will be able to complete the notebooks. 
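[Editor's sketch] Returning to the astrometry content of this PR: the solution types discussed in the revised notebook (a posteriori `FIT`, a priori `GSC240`/`HSC30`, and distortion-only) can be distinguished from the `WCSNAME` string alone. The helper below is an illustration of the naming convention described in section 4 of the notebook; it is not part of the notebook or of `stwcs`.

```python
def classify_wcsname(wcsname):
    """Classify an HST WCSNAME string by solution type.

    Conventions, as described in the revised notebook:
      * contains 'FIT'               -> a posteriori fit to a reference catalog
      * ends in '-GSC240' / '-HSC30' -> a priori guide-star correction
      * bare IDC_<rootname>          -> distortion-only (original pointing)
    """
    if 'FIT' in wcsname:
        return 'a posteriori'
    if wcsname.endswith(('-GSC240', '-HSC30')):
        return 'a priori'
    return 'distortion-only'


for name in ('IDC_2731450pi-FIT-REL-GAIAeDR3',
             'IDC_2731450pi-GSC240',
             'IDC_2731450pi'):
    print(name, '->', classify_wcsname(name))
```

A check like this can be looped over `new_wcsnames` (as built in the notebook) to see at a glance which solution type each headerlet extension carries.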
From b0b3d04f0fce32026beefdc0abaf4e719adac374 Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Tue, 2 Jul 2024 11:34:13 -0400 Subject: [PATCH 05/14] Create ci_execute_single.yml --- .github/workflows/ci_execute_single.yml | 20 ++++++++++++++++++++ 1 file changed, 20 insertions(+) create mode 100644 .github/workflows/ci_execute_single.yml diff --git a/.github/workflows/ci_execute_single.yml b/.github/workflows/ci_execute_single.yml new file mode 100644 index 000000000..0217d2084 --- /dev/null +++ b/.github/workflows/ci_execute_single.yml @@ -0,0 +1,20 @@ +name: Manual Single File Execute w/OS matrix +on: + workflow_dispatch: + inputs: + filename: + description: 'Notebook file name:' + required: true + default: 'notebook.ipynb' + +jobs: + GenerateHTML: + uses: spacetelescope/notebook-ci-actions/.github/workflows/ci_execute_single.yml.yml@multi_dev + with: + python-version: ${{ vars.PYTHON_VERSION }} + filename: ${{ github.event.inputs.filename }} + secrets: + CASJOBS_PW: ${{ secrets.CASJOBS_PW }} + CASJOBS_USERID: ${{ secrets.CASJOBS_USERID }} + permissions: + contents: write From 7707560b0891d30f8d3cc10c4a7074012de0f36f Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Tue, 2 Jul 2024 11:38:42 -0400 Subject: [PATCH 06/14] Update ci_execute_single.yml --- .github/workflows/ci_execute_single.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/ci_execute_single.yml b/.github/workflows/ci_execute_single.yml index 0217d2084..7a1ea6216 100644 --- a/.github/workflows/ci_execute_single.yml +++ b/.github/workflows/ci_execute_single.yml @@ -9,7 +9,7 @@ on: jobs: GenerateHTML: - uses: spacetelescope/notebook-ci-actions/.github/workflows/ci_execute_single.yml.yml@multi_dev + uses: spacetelescope/notebook-ci-actions/.github/workflows/ci_execute_single.yml@multi_dev with: python-version: ${{ vars.PYTHON_VERSION }} filename: ${{ github.event.inputs.filename 
}} From 4ab99a816ea8a98d32179edbfa36cc378c63b33f Mon Sep 17 00:00:00 2001 From: Hatice Karatay <66814693+haticekaratay@users.noreply.github.com> Date: Tue, 2 Jul 2024 15:55:34 -0400 Subject: [PATCH 07/14] Remove deprecated notebooks (#293) --- _config.yml | 1 - _toc.yml | 1 - .../Initialization/Initialization.ipynb | 337 ------------------ .../Initialization/requirements.txt | 2 - 4 files changed, 341 deletions(-) delete mode 100644 notebooks/DrizzlePac/Initialization/Initialization.ipynb delete mode 100644 notebooks/DrizzlePac/Initialization/requirements.txt diff --git a/_config.yml b/_config.yml index cb25f28a5..99e39594f 100644 --- a/_config.yml +++ b/_config.yml @@ -49,5 +49,4 @@ html: use_repository_button: true # Exclude notebooks that have as-yet unresolved build failures (see tickets SPB-1153SPB-1160, SPB-1168) exclude_patterns: [notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb, - notebooks/DrizzlePac/Initialization/Initialization.ipynb, notebooks/WFC3/dash/dash.ipynb] \ No newline at end of file diff --git a/_toc.yml b/_toc.yml index ee0ad98ae..f4569f005 100644 --- a/_toc.yml +++ b/_toc.yml @@ -30,7 +30,6 @@ parts: - caption: DrizzlePac chapters: - file: notebooks/DrizzlePac/README.md - #- file: notebooks/DrizzlePac/Initialization/Initialization.ipynb - file: notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb - file: notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb - file: notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.ipynb diff --git a/notebooks/DrizzlePac/Initialization/Initialization.ipynb b/notebooks/DrizzlePac/Initialization/Initialization.ipynb deleted file mode 100644 index caa377063..000000000 --- a/notebooks/DrizzlePac/Initialization/Initialization.ipynb +++ /dev/null @@ -1,337 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# DrizzlePac Initialization\n", - "\n", - "
This Jupyter notebook discusses the steps necessary to set up your computing environment to use DrizzlePac. This is the first step before using any of the other DrizzlePac tutorials. The code cells in this notebook can be used to partially confirm that your environment is properly configured for DrizzlePac before proceeding to the other tutorials.\n", - "\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Introduction\n", - "\n", - "DrizzlePac is a Python software package developed at STScI that is designed to align and combine HST images and is a successor to the older MultiDrizzle software. Since July 2012, all drizzled data products obtained from MAST are produced with AstroDrizzle. \n", - "\n", - "An abbreviation for Astrometric Drizzle, AstroDrizzle was designed from the ground-up to substantially improve the handling of distortion in the image header World Coordinate System. AstroDrizzle removes geometric distortion, corrects for sky background variations, flags cosmic-rays, and combines images with optional subsampling. Drizzled data products from MAST are generated for single visit associations only.\n", - "\n", - "To combine data from additional visits, TweakReg may be used to update the image WCS using matched source lists. Once the full set of images of a given target are properly aligned, they may be combined using AstroDrizzle.\n", - "\n", - "While the DrizzlePac software has been optimized to work with Hubble Space Telescope (HST) data, it can work with other types of data so long as the images adhere to the FITS standards for multi-extension files and for describing the World Coordinate System (WCS). It assumes that all distortions have been properly described in the WCS of the image, e.g. via the SIP distortion coefficients. [More details may be found here](http://www.stsci.edu/scientific-community/software/drizzlepac/features.html#h3-3-61c90abe-2d25-4c81-b5e0-450b9a59b17b) under the section 'Aligning to Non-HST Image'. 
\n", - "\n", - "In this notebook, we will demonstrate how to set up your environment to analyze HST data with links to several resources, as well as a demonstration of how to download observations from the HST archive and their associated reference files." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "
Before working with HST data, users are advised to consult both the data handbooks and the instrument team websites for the instrument of interest. Additional useful discussion of the drizzling algorithm and of how distortion information is represented in the image header may be found in the DrizzlePac Handbook.
" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "**Notes about HST files:**\n", - "\n", - "HST images are stored using the Flexible Image Transport System (FITS). In particular, HST data use a multi-extension FITS (MEF) format. In this format, science data, error arrays, and data quality information for a single observation are stored as different extensions within a single file. The data format for the detector you are working with may be found in your instrument's data handbook, e.g. for [WFC3](https://hst-docs.stsci.edu/display/WFC3IHB) and [ACS](https://hst-docs.stsci.edu/display/ACSIHB/ACS+Instrument+Handbook).\n", - "\n", - "\n", - "\n", - "**Recent changes to ACS/WFC and WFC3/UVIS data quality flags:**\n", - "\n", - "In early 2017, the ACS instrument team changed the definition of data quality (DQ) flags populated in the calibrated FLT/FLC files. New calibration techniques now make it possible discern between unstable and stable hot pixels, the later of which are corrected by ‘calacs’ when subtracting the dark. Thus, pixels identified as hot and stable (DQ flag=16) may now be treated as 'good' data when drizzling, and those identified as unstable (DQ flag=32) should be treated as 'bad'. A new MDRIZTAB reference table (16r12191j_mdz.fits) was delivered in June 2017 and contains a set of default parameters for combining exposures with AstroDrizzle. With changes to the DQ flag definitions, the parameters 'driz_sep_bits' and 'final_bits', which define DQ flags for drizzle to ignore (e.g. to treat as good), are now set to a value of 336 (the sum of 16+64+256) so that stable hot pixels, warm pixels, and full-well saturated pixels will not be rejected when combining exposures. 
For details, see [ACS ISR 2017-05](http://www.stsci.edu/hst/acs/documents/isrs/isr1705.pdf).\n", - "\n", - "The WFC3 instrument team implemented a similar change to the DQ flag definitions in December 2018, and an updated MDRIZTAB reference file (2ck18260i_mdz.fits) reflects the new recommended drizzle parameter settings such that DQ flag values 16, 64, and 256 are treated as good pixels. These new flags are valid for UVIS observations obtained after Nov 08 2012, when the dark calibration program began using post-flash to mitigate hot pixel trailing due to poor charge transfer efficiency at low background levels. The new UVIS bad pixel tables are described in [WFC3 ISR 2018-15](http://www.stsci.edu/hst/wfc3/documents/ISRs/WFC3-2018-15.pdf).\n", - "\n", - "This new set of DrizzlePac notebooks takes into account the updated DQ parameter settings for processing both ACS/WFC and WFC3/UVIS data. Similar updates to WFC3/IR data quality flags will be implemented in early 2019." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Top of Page](#title_ID)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Required Software\n", - "\n", - "Anaconda is a tool for managing Python package installations in various environments, and AstroConda adds STScI-specific astronomy-related Python packages. Users unfamiliar with Anaconda/AstroConda should see the [documentation](https://astroconda.readthedocs.io/en/latest/). One must first install Anaconda and can then use our AstroConda channel within it. **For most users**, the standard Python 3 software stack that **does not include IRAF/PyRAF** is appropriate and should be used for DrizzlePac. 
\n", - "\n", - "**Please be sure you have the latest version of AstroConda.** The AstroConda page on [updating your software stack](https://astroconda.readthedocs.io/en/latest/updating.html) goes into more detail, but you can update your base `conda` installation and *everything* installed in your AstroConda environment (assuming it is named \"astroconda\") by typing in a bash shell:\n", - "```\n", - "conda deactivate\n", - "conda update --all\n", - "conda update -n astroconda --all\n", - "conda activate astroconda\n", - "```\n", - "\n", - "**Please complete the previous step even if you have JUST installed AstroConda or Conda as it is necessary to ensure all updates.**\n", - "\n", - "For example, a newer change has made the `activate/deactivate` command above begin with `conda` instead of `source`. You should update everything to ensure you stay up-to-date with the software. \n", - "\n", - "In addition to the default AstroConda configuration, many DrizzlePac examples will use [astroquery](https://astroquery.readthedocs.io/en/latest/) to obtain data from the Mikulski Archive for Space Telescopes (MAST). To install this, type the following in your bash shell:\n", - "```\n", - "conda install -c conda-forge astroquery\n", - "```\n", - "The `astroquery.mast` API has [additional documentation](https://astroquery.readthedocs.io/en/latest/mast/mast.html) for reference.\n", - "\n", - "Many of the notebooks make use of `ImageFileCollections` in `ccdproc` to inspect the image header. To install this, type:\n", - "```\n", - "conda install -c conda-forge ccdproc\n", - "```\n", - "For each of the tutorials, a 'requirements.txt' file is present in the directory along with the notebook which lists any other package dependencies. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "
Note that these notebooks are not tutorials for conda or Python, but all steps needed to work with DrizzlePac are explained herein. There are in-depth introductions to conda available here and to Python available here.
" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Top of Page](#title_ID)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Imports \n", - "\n", - "These imports are required for this particular notebook and are used for checking your system's setup.\n", - "\n", - "- `astroquery.mast Observations`: Establishes a connection to a server to query MAST. Please try re-running the cell if the connection fails.\n", - "- `os`: Python interface to the operating system.\n", - "- `shutil`: Python shell utilities.\n", - "- `stwcs`: HST world coordinate system (WCS) updates." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "from astroquery.mast import Observations\n", - "import os\n", - "import shutil\n", - "import stwcs\n", - "import subprocess " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Top of Page](#title_ID)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Retrieving Data from MAST\n", - "\n", - "The `astroquery.mast` API can be used to programatically retrieve data from the HST archive with the same kinds of filtering available through the MAST Portal. Here we show an example of how to retrieve a WFC3/UVIS observation of NGC104 by searching for the specific dataset name (`obs_id` in the search below). Note that we have set `obstype='all'` as some datasets may be classified as calibration if they were taken as part of an instrument calibration program even though they are perfectly useable for science. The default behavior is to search only for science observations." 
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "obs_table = Observations.query_criteria(obs_id='ib2j02n5q', obstype='all')\n", - "download_tab = Observations.download_products(obs_table['obsid'], mrp_only=False, \n", - " productSubGroupDescription=['FLC'])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now let's move all of the files we just downloaded to the current working directory and remove the \"mastDownload/\" directory:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "for file in download_tab['Local Path']:\n", - " os.rename(file, os.path.basename(file))\n", - " \n", - "shutil.rmtree('mastDownload')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Top of Page](#title_ID)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Reference Files\n", - "\n", - "HST data require reference files that specify calibration information. `DrizzlePac` also needs various reference files in order to work properly, so it is important to follow these instructions to give `DrizzlePac` access to the needed calibration files. \n", - "\n", - "There are many types of reference files for each detector, so care should be taken to ensure that all necessary reference files for a particular dataset have been retrieved. The path to the reference files for each instrument (e.g., WFC3, ACS, WFPC2) is indicated with the appropriate environment variable. For the instruments supported by DrizzlePac, these are:\n", - "\n", - "- ACS = jref\n", - "- WFC3 = iref\n", - "- WFPC2 = uref\n", - "\n", - "The Calibration Reference Data System (CRDS) both stores the reference files and determines the mapping of reference files to observations. The `crds` tool included in AstroConda can find and download the best reference files for a particular observation. 
The [documentation](https://hst-crds.stsci.edu/static/users_guide/index.html) for `crds` describes many of the more advanced options, but we will demonstrate here how to obtain updated reference file information stored in the FITS header of an observation and also download those files to a local directory.\n", - "\n", - "First, we need to set some environment variables:\n", - "- CRDS_SERVER_URL: Location of the CRDS server.\n", - "- CRDS_PATH: Path to where reference files will be downloaded." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", - "os.environ['CRDS_PATH'] = os.path.abspath(os.path.join('.', 'reference_files'))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "While the `crds.bestrefs` tool is also accessible from within Python, it was designed with a command-line interface in mind, so it is easiest to use it that way:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "subprocess.check_output('crds bestrefs --files ib2j02n5q_flc.fits --sync-references=1 --update-bestrefs', shell=True, stderr=subprocess.DEVNULL)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now that we have the reference files for this observation downloaded to a local directory called reference_files/, we need to tell the DrizzlePac software how to find these files. 
Our example dataset \"ib2j02n5q\" comes from WFC3, so we indicate the path to the associated reference files with the \"iref\" environment variable:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "os.environ['iref'] = os.path.abspath(os.path.join('.', 'reference_files', 'references', 'hst', 'wfc3')) + os.path.sep" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now the DrizzlePac software will be able to locate the reference files it requires if the user needs to update the geometric distortion information in the image header. These files are the IDCTAB (Instrument Distortion Correction Table, `*idc.fits`), the D2IMFILE (Column Correction Reference File, `*d2i.fits`), and the NPOLFILE (Non-polynomial Offsets Reference File, `*npl.fits`). " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Top of Page](#title_ID)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Update World Coordinate System\n", - "\n", - "Before combining images with `drizzlepac.astrodrizzle` to make mosaics, the data may first need to be processed with `stwcs.updatewcs`. This task is required for populating older (pre-AstroDrizzle Archive pipeline data) WFC3, ACS, STIS, and WFPC2 images with linear and polynomial distortion correction information in a format compatible with AstroDrizzle. It is also required for data that the user wishes to update to use a more recent (or custom) set of distortion reference files that were downloaded from MAST previously. \n", - "\n", - "The `updatewcs` task will update the header keywords with new WCS information and apply several distortion corrections from reference files. In general, data recently retrieved from MAST will already have had this step performed, and it does **not** need to be run again. 
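The decision logic can be illustrated with a small sketch. The headers below are mocked as plain dictionaries for illustration only; this is a simplified heuristic, not the actual `stwcs` implementation, and the keyword values are invented:

```python
def needs_updatewcs(header):
    """Heuristic sketch: files processed by updatewcs (or the modern
    pipeline) carry a WCSNAME keyword; files without one likely need it."""
    return 'WCSNAME' not in header


# Mocked headers for illustration only (not read from real HST files):
old_style = {'CRVAL1': 6.02, 'CRVAL2': -72.08}            # retrieved long ago
pipeline = {'WCSNAME': 'IDC_2731450pi', 'CRVAL1': 6.02}   # recent MAST product

print(needs_updatewcs(old_style))  # True: run stwcs.updatewcs.updatewcs()
print(needs_updatewcs(pipeline))   # False: already prepared for AstroDrizzle
```

In practice the same check can be made against a real header read with `astropy.io.fits`, which behaves like a dictionary for keyword lookups.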
\n", - "\n", - "**WARNING: If you re-run an instrument calibration pipeline on raw data, or your files were retrieved from MAST long ago, you must run `updatewcs` or you will encounter errors.**\n", - "\n", - "As an example, if it were necessary to run `updatewcs` on our WFC3/UVIS file, we can update the WCS as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "stwcs.updatewcs.updatewcs('ib2j02n5q_flc.fits', use_db=False)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Note that we have set `use_db=False` in our call to `updatewcs`. The Astrometric Database referenced in this call is a work in progress, and astrometric solutions are not yet available for all parts of the sky. The Astrometry Working Group at STScI has created this infrastructure in `updatewcs` to include multiple astrometric solutions as additional extensions to HST FITS files. The default behavior of `updatewcs` is `use_db=True`. In cases where it is left with the default value, warnings may appear while using the Astrometric Database with `updatewcs`, but your data are still properly prepared for `astrodrizzle`." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## About this Notebook\n", - "\n", - " Author: T. 
Desjardins, STScI ACS Team \n", - " Updated: December 14, 2018" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "[Top of Page](#title_ID)" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.5.6" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/notebooks/DrizzlePac/Initialization/requirements.txt b/notebooks/DrizzlePac/Initialization/requirements.txt deleted file mode 100644 index 34a969aee..000000000 --- a/notebooks/DrizzlePac/Initialization/requirements.txt +++ /dev/null @@ -1,2 +0,0 @@ -astroquery==0.4.6 -stwcs==1.7.2 From 9f39ab1e6ad79962adb6ac13ae5f0940a14eb7a7 Mon Sep 17 00:00:00 2001 From: Mitchell Revalski <56605543+mrevalski@users.noreply.github.com> Date: Wed, 3 Jul 2024 12:55:00 -0400 Subject: [PATCH 08/14] added psf notebook to readme --- notebooks/WFC3/README.md | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/notebooks/WFC3/README.md b/notebooks/WFC3/README.md index cd932856a..8d019c07b 100644 --- a/notebooks/WFC3/README.md +++ b/notebooks/WFC3/README.md @@ -4,7 +4,7 @@ WFC3 Notebooks WFC3 Notebooks is the primary repository for analyzing data from the [Wide Field Camera 3](https://www.stsci.edu/hst/instrumentation/wfc3) on the Hubble Space Telescope. The Jupyter notebooks include tools for general data analysis, -WFC3/IR time variable background (TVB), and photometry. This repository contains the +WFC3/IR time variable background (TVB), photometry, and point spread function (PSF) modeling. This repository contains the complementary notebooks mentioned in the [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb). 
These notebooks include: @@ -29,6 +29,9 @@ Photometry: - [Calculating WFC3 Zeropoints with `stsynphot`](https://spacetelescope.github.io/hst_notebooks/notebooks/WFC3/zeropoints/zeropoints.html) - [WFC3/UVIS Pixel Area Map Corrections for Subarrays](https://spacetelescope.github.io/hst_notebooks/notebooks/WFC3/uvis_pam_corrections/WFC3_UVIS_Pixel_Area_Map_Corrections_for_Subarrays.html) +Point Spread Function: + - [HST WFC3 Point Spread Function Modeling](https://spacetelescope.github.io/hst_notebooks/notebooks/WFC3/point_spread_function/hst_point_spread_function.html) + See the [WFC3 Instrument Handbook](https://hst-docs.stsci.edu/wfc3ihb), [WFC3 Data Handbook](https://hst-docs.stsci.edu/wfc3dhb), [wfc3tools](https://github.com/spacetelescope/wfc3tools), and From 183b811781f1a93970178270a79fe1f1ba796981 Mon Sep 17 00:00:00 2001 From: Mitchell Revalski <56605543+mrevalski@users.noreply.github.com> Date: Wed, 3 Jul 2024 14:27:45 -0400 Subject: [PATCH 09/14] Spelling Corrections (#295) * spelling corrections * pinned photutils==1.12.0 --- .../align_to_catalogs/align_to_catalogs.ipynb | 20 +++++++++---------- .../align_to_catalogs/requirements.txt | 1 + 2 files changed, 11 insertions(+), 10 deletions(-) diff --git a/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb b/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb index d81dffde4..d3a634551 100644 --- a/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb +++ b/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb @@ -126,7 +126,7 @@ "\n", "For this notebook, we will download WFC3/UVIS images of NGC 6791 from \n", "program [12692](http://www.stsci.edu/cgi-bin/get-proposal-info?id=12692&observatory=HST) \n", - "in the F606W filter acquired in Visit 01. 
The three FLC images have been processed by the HST WFC3 pipeline (calwf3), which includes bias subtraction, dark current and CTE correction, cosmic-ray rejection, and flatfielding:\n", + "in the F606W filter acquired in Visit 01. The three FLC images have been processed by the HST WFC3 pipeline (calwf3), which includes bias subtraction, dark current and CTE correction, cosmic-ray rejection, and flat-fielding:\n", "\n", " ibwb01xqq_flc.fits\n", " ibwb01xrq_flc.fits\n", @@ -555,7 +555,7 @@ " outshifts='Gaia_shifts.txt', # Name of the shift file that will be saved.\n", " wcsname=wcsname, # Give our WCS a new name defined above.\n", " reusename=True, # Overwrite the WCS if it exists.\n", - " ylimit=0.3,\n", + " ylimit=0.3, # The ylimit for the residuals plot.\n", " fitgeometry='rscale', # Allow for translation, rotation, and plate scaling.\n", " searchrad=0.1, # With many sources use a smaller search radius for convergence.\n", " updatehdr=False) # Don't update the header with new WCS until later.\n", @@ -689,7 +689,7 @@ "\n", "### 3.5 Overplot Matched Sources on the Image\n", "\n", - "Now, let's plot the sources that were matched between all images on the \"bottom\" UVIS chip. This is referred to as chip 2 or SCI, 1 or extension 1. The cell below reads in the `*_fit.match` file as an `astropy` table. Unfortunatley, it doesn't automatically name columns so we look at the header to grab the columns.\n", + "Now, let's plot the sources that were matched between all images on the \"bottom\" UVIS chip. This is referred to as chip 2 or SCI, 1 or extension 1. The cell below reads in the `*_fit.match` file as an `astropy` table. Unfortunately, it doesn't automatically name columns so we look at the header to grab the columns.\n", "\n", "To verify that TweakReg obtained an acceptable fit between matched source catalogs, it is useful to inspect the results before updating the image header WCS. 
In the figure below, sources that are matched with Gaia are overplotted on one of the input FLC frames with the two chips merged together. \n", "\n", @@ -715,8 +715,8 @@ "plt.imshow(fullsci, cmap='Greys', origin='lower', vmin=z1, vmax=z2)\n", "\n", "match_tab = ascii.read(rootname+'_flc_catalog_fit.match') # Load the match file in astropy table.\n", - "match_tab_chip1 = match_tab[match_tab['col15'] == 2] # Filter the table for sources on chip 1 (on ext 4).\n", - "match_tab_chip2 = match_tab[match_tab['col15'] == 1] # Filter he table for sources on chip 1 (on ext 4).\n", + "match_tab_chip1 = match_tab[match_tab['col15'] == 2] # Filter the table for sources on chip 1.\n", + "match_tab_chip2 = match_tab[match_tab['col15'] == 1] # Filter the table for sources on chip 2.\n", "x_cord1, y_cord1 = match_tab_chip1['col11'], match_tab_chip1['col12']\n", "x_cord2, y_cord2 = match_tab_chip2['col11'], match_tab_chip2['col12']\n", "\n", @@ -755,7 +755,7 @@ " outshifts='Gaia_shifts.txt', # Name of the shift file that will be saved.\n", " wcsname=wcsname, # Give our WCS a new name defined above.\n", " reusename=True, # Overwrite the WCS if it exists.\n", - " ylimit=0.3,\n", + " ylimit=0.3, # The ylimit for the residuals plot.\n", " fitgeometry='rscale', # Allow for translation, rotation, and plate scaling.\n", " searchrad=0.1, # With many sources use a smaller search radius for convergence.\n", " updatehdr=True) # Update the header with new WCS solution.\n", @@ -774,7 +774,7 @@ "\n", "While the three sets of FLC files are now aligned, in this example we drizzle together only the two long exposures. When exposures are very different lengths, drizzling them together doesn't work well when 'EXP' weighting is used. For objects that saturate in the long exposures, the problem occurs at the boundary where the signal transitions from only being present in short exposures near the core to being present at larger radii in the longer exposures. 
The result is a discontinuity in the PSF radial profile and a resulting flux that is too low in some regions. For photometry of saturated objects, the short exposures should be drizzled separately from the long exposures. \n", " \n", - "Next, we combine the images throught drizzling, and retrieve some recommended values for this process from the MDRIZTAB reference file. The parameters in this file are different for each detector and are based on the number of input frames. These are a good starting point for drizzling and may be adjusted based on your science goals." + "Next, we combine the images through drizzling, and retrieve some recommended values for this process from the MDRIZTAB reference file. The parameters in this file are different for each detector and are based on the number of input frames. These are a good starting point for drizzling and may be adjusted based on your science goals." ] }, { @@ -907,7 +907,7 @@ "\n", "[Table of Contents](#toc)\n", "\n", - "Many other services have interfaces for querying catalogs that can also be used to align HST images. In general, Gaia works very well for HST due to it's high precision, but can have a low number of sources in some regions, especially at high galactic latitudes. Aligning images to an absolute frame provides a way to make data comparable across many epochs/detectors/observatories, and in many cases, makes the alignment easier to complete." + "Many other services have interfaces for querying catalogs that can also be used to align HST images. In general, Gaia works very well for HST due to its high precision, but can have a low number of sources in some regions, especially at high galactic latitudes. Aligning images to an absolute frame provides a way to make data comparable across many epochs/detectors/observatories, and in many cases, makes the alignment easier to complete." ] }, { @@ -918,7 +918,7 @@ "## About this Notebook\n", " \n", " Created: 14 Dec 2018; V. 
Bajaj\n", - " Updated: 16 May 2024; M. Revalski, V. Bajaj, & J. Mack\n", + " Updated: 03 Jul 2024; M. Revalski, V. Bajaj, & J. Mack\n", "\n", "**Source:** GitHub [spacetelescope/hst_notebooks](https://github.com/spacetelescope/hst_notebooks)\n", "\n", @@ -965,7 +965,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.7" + "version": "3.11.9" } }, "nbformat": 4, diff --git a/notebooks/DrizzlePac/align_to_catalogs/requirements.txt b/notebooks/DrizzlePac/align_to_catalogs/requirements.txt index b89c9626e..d35b1acb6 100644 --- a/notebooks/DrizzlePac/align_to_catalogs/requirements.txt +++ b/notebooks/DrizzlePac/align_to_catalogs/requirements.txt @@ -5,4 +5,5 @@ drizzlepac==3.6.2 ipython==8.21.0 matplotlib==3.8.2 numpy==1.26.3 +photutils==1.12.0 jupyter==1.0.0 \ No newline at end of file From 07717d1b5d49d515950944cb687324c95e04ecc4 Mon Sep 17 00:00:00 2001 From: Mitchell Revalski <56605543+mrevalski@users.noreply.github.com> Date: Tue, 9 Jul 2024 12:37:17 -0400 Subject: [PATCH 10/14] pin photutils to 1.12.0 for compatibility (#296) * pin photutils to 1.12.0 for compatibility * modified comments to trigger CI * Revert changes made to align_multiple_visits and sky_matching --------- Co-authored-by: Hatice Karatay --- notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb | 7 ++++--- notebooks/DrizzlePac/align_mosaics/requirements.txt | 1 + .../DrizzlePac/align_multiple_visits/requirements.txt | 1 + .../align_sparse_fields/align_sparse_fields.ipynb | 2 +- notebooks/DrizzlePac/align_sparse_fields/requirements.txt | 1 + .../DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb | 4 ++-- notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb | 4 ++-- notebooks/DrizzlePac/drizzle_wfpc2/requirements.txt | 1 + notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb | 5 +++-- notebooks/DrizzlePac/mask_satellite/requirements.txt | 1 + .../optimize_image_sampling/optimize_image_sampling.ipynb | 4 ++-- 
.../DrizzlePac/optimize_image_sampling/requirements.txt | 1 + notebooks/DrizzlePac/sky_matching/requirements.txt | 1 + .../use_ds9_regions_in_tweakreg.ipynb | 4 ++-- .../using_updated_astrometry_solutions/requirements.txt | 1 + .../using_updated_astrometry_solutions.ipynb | 2 +- 16 files changed, 25 insertions(+), 15 deletions(-) diff --git a/notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb b/notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb index 3c43891f3..97aa09cfd 100755 --- a/notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb +++ b/notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb @@ -13,7 +13,7 @@ "\n", "***\n", "\n", - "
This notebook assumes you have created and activated a virtual environment using the requirements file in this notebook's repository. Please make sure you have read the contents of the README file before continuing the notebook. Note that the text file \"align_mosaics_uvis_skyfile.txt\" is one of the downloads expected for this notebook.
\n", + "
This notebook requires creating and activating a virtual environment using the requirements file in this notebook's repository. Please also review the README file before using the notebook. Note that the text file \"align_mosaics_uvis_skyfile.txt\" is one of the downloads expected for this notebook.
\n", "\n", "## Learning Goals\n", "\n", @@ -21,7 +21,7 @@ "\n", "- Download WFC3 UVIS & IR images with `astroquery`\n", "- Check the active WCS (world coordinate system) solution in the FITS images\n", - "- Create a Gaia reference catalog and align the IR images using `TweakReg`.\n", + "- Create a Gaia reference catalog and align the IR images using `TweakReg`\n", "- Verify the quality of the alignment results and compare to the MAST alignment solutions\n", "- Update the WCS and use `AstroDrizzle` to combine the IR mosaic \n", "- Align the UVIS data to the IR drizzled mosaic using `TweakReg`\n", @@ -155,6 +155,7 @@ "\n", "from astroquery.gaia import Gaia\n", "from astroquery.mast import Observations\n", + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks.\n", "Gaia.MAIN_GAIA_TABLE = 'gaiadr3.gaia_source' # Change if different data release is desired\n", "Gaia.ROW_LIMIT = 100000" ] @@ -1243,7 +1244,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.11.7" } }, "nbformat": 4, diff --git a/notebooks/DrizzlePac/align_mosaics/requirements.txt b/notebooks/DrizzlePac/align_mosaics/requirements.txt index bb5ae429e..d2008a4d9 100644 --- a/notebooks/DrizzlePac/align_mosaics/requirements.txt +++ b/notebooks/DrizzlePac/align_mosaics/requirements.txt @@ -4,3 +4,4 @@ drizzlepac==3.5.1 ipython==8.11.0 matplotlib==3.7.0 numpy==1.23.4 +photutils==1.12.0 diff --git a/notebooks/DrizzlePac/align_multiple_visits/requirements.txt b/notebooks/DrizzlePac/align_multiple_visits/requirements.txt index 219f32f7e..9927ef4bf 100644 --- a/notebooks/DrizzlePac/align_multiple_visits/requirements.txt +++ b/notebooks/DrizzlePac/align_multiple_visits/requirements.txt @@ -3,3 +3,4 @@ astroquery==0.4.6 drizzlepac==3.5.1 ipython==8.11.0 matplotlib==3.7.0 +photutils==1.12.0 diff --git a/notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.ipynb 
b/notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.ipynb index 82c3fbe18..c4ca7d139 100755 --- a/notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.ipynb +++ b/notebooks/DrizzlePac/align_sparse_fields/align_sparse_fields.ipynb @@ -106,7 +106,7 @@ "from drizzlepac import tweakreg, astrodrizzle\n", "\n", "%matplotlib inline\n", - "%config InlineBackend.figure_format = 'retina'" + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks." ] }, { diff --git a/notebooks/DrizzlePac/align_sparse_fields/requirements.txt b/notebooks/DrizzlePac/align_sparse_fields/requirements.txt index b89c9626e..d35b1acb6 100644 --- a/notebooks/DrizzlePac/align_sparse_fields/requirements.txt +++ b/notebooks/DrizzlePac/align_sparse_fields/requirements.txt @@ -5,4 +5,5 @@ drizzlepac==3.6.2 ipython==8.21.0 matplotlib==3.8.2 numpy==1.26.3 +photutils==1.12.0 jupyter==1.0.0 \ No newline at end of file diff --git a/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb b/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb index d3a634551..2c07826ee 100644 --- a/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb +++ b/notebooks/DrizzlePac/align_to_catalogs/align_to_catalogs.ipynb @@ -105,7 +105,7 @@ "from drizzlepac import tweakreg, astrodrizzle\n", "from drizzlepac.processInput import getMdriztabPars\n", "\n", - "%config InlineBackend.figure_format = 'retina' # Greatly improves the resolution of figures rendered in notebooks.\n", + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks.\n", "Gaia.MAIN_GAIA_TABLE = 'gaiadr3.gaia_source' # Change this to the desired Gaia data release.\n", "Gaia.ROW_LIMIT = 100000 # Show all of the matched sources in the printed tables.\n", "\n", @@ -965,7 +965,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.9" + "version": "3.11.7" } }, "nbformat": 4, diff 
--git a/notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb b/notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb index 5efd06600..53b27017c 100644 --- a/notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb +++ b/notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "
Note: The notebook in this repository 'Initialization.ipynb' goes over many of the basic concepts such as the setup of the environment/package installation and should be read first if you are new to HST images, DrizzlePac, or Astroquery.
" + "
This notebook assumes you have created and activated a virtual environment using the requirements file in this notebook's repository. Please make sure you have read the contents of the README file before continuing the notebook.
" ] }, { @@ -537,7 +537,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.11.7" }, "varInspector": { "cols": { diff --git a/notebooks/DrizzlePac/drizzle_wfpc2/requirements.txt b/notebooks/DrizzlePac/drizzle_wfpc2/requirements.txt index 2be567032..de2e4e914 100644 --- a/notebooks/DrizzlePac/drizzle_wfpc2/requirements.txt +++ b/notebooks/DrizzlePac/drizzle_wfpc2/requirements.txt @@ -3,6 +3,7 @@ astroquery==0.4.6 drizzlepac==3.5.1 matplotlib==3.7.0 numpy==1.23.4 +photutils==1.12.0 stsci.image==2.3.5 stsci.imagestats==1.6.3 stsci.skypac==1.0.9 diff --git a/notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb b/notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb index 169e317c5..b99981f36 100644 --- a/notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb +++ b/notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb @@ -117,7 +117,7 @@ "\n", "# configure the plot output\n", "%matplotlib inline\n", - "%config InlineBackend.figure_format = 'retina'" + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks." 
] }, { @@ -641,6 +641,7 @@ { "cell_type": "markdown", "metadata": { + "collapsed": true, "jupyter": { "outputs_hidden": true } @@ -1161,7 +1162,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": "3.11.7" } }, "nbformat": 4, diff --git a/notebooks/DrizzlePac/mask_satellite/requirements.txt b/notebooks/DrizzlePac/mask_satellite/requirements.txt index b4d487864..e7221a8e5 100644 --- a/notebooks/DrizzlePac/mask_satellite/requirements.txt +++ b/notebooks/DrizzlePac/mask_satellite/requirements.txt @@ -5,6 +5,7 @@ crds==11.17.15 drizzlepac==3.6.2 ipython==8.21.0 matplotlib==3.8.2 +photutils==1.12.0 jupyter==1.0.0 acstools==3.7.1 scikit-image==0.20.0 diff --git a/notebooks/DrizzlePac/optimize_image_sampling/optimize_image_sampling.ipynb b/notebooks/DrizzlePac/optimize_image_sampling/optimize_image_sampling.ipynb index f40bca2c8..31ee49783 100644 --- a/notebooks/DrizzlePac/optimize_image_sampling/optimize_image_sampling.ipynb +++ b/notebooks/DrizzlePac/optimize_image_sampling/optimize_image_sampling.ipynb @@ -100,7 +100,7 @@ "import drizzlepac\n", "\n", "%matplotlib inline\n", - "%config InlineBackend.figure_format = 'retina' # Greatly improves the resolution of figures rendered in notebooks.\n", + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks.\n", "\n", "# Set the locations of reference files. 
and retrieve the MDRIZTAB recommended drizzle parameters.\n", "os.environ['CRDS_SERVER_URL'] = 'https://hst-crds.stsci.edu'\n", @@ -735,7 +735,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.9" + "version": "3.11.7" } }, "nbformat": 4, diff --git a/notebooks/DrizzlePac/optimize_image_sampling/requirements.txt b/notebooks/DrizzlePac/optimize_image_sampling/requirements.txt index b715672ee..3a39dd563 100644 --- a/notebooks/DrizzlePac/optimize_image_sampling/requirements.txt +++ b/notebooks/DrizzlePac/optimize_image_sampling/requirements.txt @@ -6,3 +6,4 @@ ipython==8.22.2 jupyter==1.0.0 matplotlib==3.8.4 numpy==1.26.4 +photutils==1.12.0 diff --git a/notebooks/DrizzlePac/sky_matching/requirements.txt b/notebooks/DrizzlePac/sky_matching/requirements.txt index 8d7d77f94..920a0035f 100644 --- a/notebooks/DrizzlePac/sky_matching/requirements.txt +++ b/notebooks/DrizzlePac/sky_matching/requirements.txt @@ -4,5 +4,6 @@ ccdproc==2.4.0 drizzlepac==3.5.1 ipython==8.11.0 matplotlib==3.7.0 +photutils==1.12.0 pandas==1.5.3 stwcs==1.7.2 diff --git a/notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb b/notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb index 0c100ea4f..e7014f6db 100644 --- a/notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb +++ b/notebooks/DrizzlePac/use_ds9_regions_in_tweakreg/use_ds9_regions_in_tweakreg.ipynb @@ -99,7 +99,7 @@ "\n", "# set plotting details for notebooks\n", "%matplotlib inline\n", - "%config InlineBackend.figure_format = 'retina' # Greatly improves the resolution of figures rendered in notebooks.\n", + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks.\n", "plt.rcParams['figure.figsize'] = (20, 20)" ] }, @@ -738,7 +738,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.13" + "version": 
"3.11.7" } }, "nbformat": 4, diff --git a/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt b/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt index b89c9626e..d35b1acb6 100644 --- a/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt +++ b/notebooks/DrizzlePac/using_updated_astrometry_solutions/requirements.txt @@ -5,4 +5,5 @@ drizzlepac==3.6.2 ipython==8.21.0 matplotlib==3.8.2 numpy==1.26.3 +photutils==1.12.0 jupyter==1.0.0 \ No newline at end of file diff --git a/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb b/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb index 5e2d53a9a..d41d74151 100644 --- a/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb +++ b/notebooks/DrizzlePac/using_updated_astrometry_solutions/using_updated_astrometry_solutions.ipynb @@ -102,7 +102,7 @@ "from collections import defaultdict\n", "%matplotlib notebook\n", "%matplotlib inline\n", - "%config InlineBackend.figure_format = 'retina' # Greatly improves the resolution of figures rendered in notebooks." + "%config InlineBackend.figure_format = 'retina' # Improves the resolution of figures rendered in notebooks." 
] }, { From de334e6e4f24b4bc8d3e7bb471a5ce0f201f48a2 Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Fri, 12 Jul 2024 05:05:17 -0400 Subject: [PATCH 11/14] Create ci_security_scan.yml --- .github/workflows/ci_security_scan.yml | 10 ++++++++++ 1 file changed, 10 insertions(+) create mode 100644 .github/workflows/ci_security_scan.yml diff --git a/.github/workflows/ci_security_scan.yml b/.github/workflows/ci_security_scan.yml new file mode 100644 index 000000000..e162c07e5 --- /dev/null +++ b/.github/workflows/ci_security_scan.yml @@ -0,0 +1,10 @@ +name: Manual Security Scan +on: + workflow_dispatch: + #schedule: + #- cron: '0 3 * * *' # run at 3 AM UTC + # - cron: '0 0 * * 0' # midnight Sunday UTC + +jobs: + Scheduled: + uses: spacetelescope/notebook-ci-actions/.github/workflows/ci_security_scan.yml@v3 From 97ad97524db684ed5584a1f5acd64bababdf677f Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Wed, 24 Jul 2024 14:43:12 -0400 Subject: [PATCH 12/14] Update pre-requirements.sh --- notebooks/STIS/cross-correlation/pre-requirements.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/notebooks/STIS/cross-correlation/pre-requirements.sh b/notebooks/STIS/cross-correlation/pre-requirements.sh index 4211058bc..499ae89fb 100755 --- a/notebooks/STIS/cross-correlation/pre-requirements.sh +++ b/notebooks/STIS/cross-correlation/pre-requirements.sh @@ -1 +1 @@ -conda install -y -c conda-forge hstcal==2.7.4 \ No newline at end of file +conda install -y -c conda-forge hstcal From 673e555d861d15b4b0ad6d823c5572d313b8ff57 Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Wed, 24 Jul 2024 14:43:43 -0400 Subject: [PATCH 13/14] Update requirements.txt --- notebooks/STIS/cross-correlation/requirements.txt | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/notebooks/STIS/cross-correlation/requirements.txt
b/notebooks/STIS/cross-correlation/requirements.txt index 52763d0a0..50281a50c 100644 --- a/notebooks/STIS/cross-correlation/requirements.txt +++ b/notebooks/STIS/cross-correlation/requirements.txt @@ -1,7 +1,7 @@ -astropy==5.3.3 -astroquery==0.4.6 -matplotlib==3.7.0 -numpy==1.23.4 -scipy==1.10.0 -stistools==1.4.4 +astropy>=5.3.3 +astroquery>=0.4.6 +matplotlib>=3.7.0 +numpy +scipy>=1.10.0 +stistools>=1.4.4 crds>=11.17 From 3f3454f1ebeac7d137e18cf8b704b661c46074c8 Mon Sep 17 00:00:00 2001 From: "M.Gough" <60232355+mgough-970@users.noreply.github.com> Date: Wed, 24 Jul 2024 16:41:28 -0400 Subject: [PATCH 14/14] Update requirements.txt --- notebooks/ACS/acs_reduction/requirements.txt | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/notebooks/ACS/acs_reduction/requirements.txt b/notebooks/ACS/acs_reduction/requirements.txt index 8c77068d7..fdb96373e 100644 --- a/notebooks/ACS/acs_reduction/requirements.txt +++ b/notebooks/ACS/acs_reduction/requirements.txt @@ -1,5 +1,5 @@ -astropy==5.3.3 -astroquery==0.4.6 -stwcs==1.7.2 -crds==11.17.0 -matplotlib==3.7.0 \ No newline at end of file +astropy>=5.3.3 +astroquery>=0.4.6 +stwcs>=1.7.2 +crds>=11.17.0 +matplotlib>=3.7.0
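The final two patches relax hard `==` pins to `>=` version floors (leaving `numpy` fully unpinned), which makes the requirements files tolerant of newer releases. As a minimal sketch — not part of the patch series itself — the snippet below shows one way such floors could be checked against the active environment using only the standard library. The parsing and comparison are deliberately naive (numeric components only; pre-release and post-release suffixes are ignored), so a real workflow would use the `packaging` library instead.

```python
import re
from importlib import metadata

def parse_requirement(line):
    """Split a 'name>=1.2.3' or 'name==1.2.3' requirement into (name, op, version).

    Returns None for unpinned entries such as a bare 'numpy', which impose
    no version floor at all.
    """
    m = re.match(r"^\s*([A-Za-z0-9_.\-]+)\s*(==|>=)\s*([0-9][0-9A-Za-z.]*)\s*$", line)
    if m is None:
        return None
    return m.groups()

def version_key(version):
    """Naive sort key: the tuple of numeric components ('3.11.7' -> (3, 11, 7))."""
    return tuple(int(p) for p in re.findall(r"\d+", version))

def check_installed(requirements):
    """Return (name, installed_or_None, required) entries that violate a constraint."""
    problems = []
    for line in requirements:
        parsed = parse_requirement(line)
        if parsed is None:
            continue  # no constraint to enforce
        name, op, required = parsed
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            problems.append((name, None, required))
            continue
        ok = (version_key(installed) == version_key(required) if op == "=="
              else version_key(installed) >= version_key(required))
        if not ok:
            problems.append((name, installed, required))
    return problems
```

For example, `check_installed(["astropy>=5.3.3", "numpy"])` would flag `astropy` only if it is missing or older than 5.3.3, mirroring the looser constraints these patches adopt.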