\section{Scenarios for Processing Data from Special Programs}\label{sec:spcs}

This section presents \textbf{\emph{hypothetical}} examples of Special Programs and of the
regular, Special, and User-Generated processing that \emph{might be done} for them.

The details of the data acquisition and processing mentioned below are
\emph{just illustrative examples} of decisions that have yet to be made.

The steps used to describe the hypothetical processing for each case scenario are: \\
Step 1. Data acquisition. \\
Step 2. Regular Prompt processing and Alert Production. \\
Step 3. Special Processing with reconfigured pipelines. \\
Step 4. Regular processing for inclusion in WFD program data products. \\
Step 5. User-Generated Processing. \\

\subsection{Outer solar system mini-survey}\label{ssec:SPCS_TNO}

This hypothetical Special Programs processing summary is based on the Becker et al. (2011)
white paper to find outer solar system objects with shift-and-stack (SAS) \citedsp{Document-11013}.
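
As a brief illustration of the technique (the notation here is ours, not taken from the
white paper): for a trial sky-plane motion rate $(\dot{x}, \dot{y})$, the $N$ single-visit
difference images $D_i$ taken at times $t_i$ are shifted to a common epoch $t_0$ and summed,
\begin{equation}
S_{\dot{x},\dot{y}}(x, y) = \sum_{i=1}^{N} D_i\bigl(x + \dot{x}\,(t_i - t_0),\; y + \dot{y}\,(t_i - t_0)\bigr),
\end{equation}
so that a moving object whose motion matches the trial rate adds coherently, and its
signal-to-noise ratio grows roughly as $\sqrt{N}$ relative to a single exposure.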

Step 1. Data acquisition. \\
The observational sequence is triggered.
In a single night, the 9 adjacent fields in a $3\times3$ grid are observed with
$336$ $\times$ $15$ second $r$ or $g$-band exposures (168 standard visits).
These observations are repeated 2-3 nights later, and then this 2-night sequence
is repeated 3 more times: 1.5 months, 3 months, and 13.5 months later.
They are not all at the same RA, Dec, but at selected ecliptic coordinates.
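
For scale (illustrative arithmetic based only on the numbers above), a single nightly
sequence of $336$ exposures corresponds to $336 \times 15~{\rm s} = 5040~{\rm s} \approx 1.4$
hours of open-shutter time, and the full program spans $4 \times 2 = 8$ such nights,
or roughly $11$ hours of exposure time in total.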

Step 2. Regular Prompt processing and Alert Production. \\
Each $2\times15$ second standard visit is processed by the Prompt pipeline
and alerts are released within 60 seconds.
Within 24 hours, the {\tt DiaSource} and {\tt DiaObject} catalogs are updated
to include the results of Prompt processing of these visits.
After 80 hours, the processed visit images and difference images become available.
All images and sources originating from this Special Program have
region and program labels, e.g., ``SP-OSSO".

The results of Prompt processing are not very relevant for this Special Program's primary science goal,
which requires a year of dispersed observations before the processing pipelines for shift-and-stack can be run.
However, including these data in Prompt processing means that
they can contribute to LSST's other time-domain and Solar System science goals.

Step 3. Special Processing with reconfigured pipelines. \\
None possible.
Shift-and-stack processing is beyond the scope of existing algorithms in the LSST Science Pipelines.

Step 4. Regular processing for inclusion in WFD program data products. \\
Every year, each $2\times15$ second standard visit is reprocessed by the DIA data release pipelines
and the results are included alongside WFD program data in the relevant DIA data products
(e.g., processed visit images, difference images, associated source catalogs).
In the first year after the Special Program is executed,
Rubin Data Management finds that 10\% of the standard visits from this Special Program
had coordinates and image quality that help improve uniformity of the all-sky coadd,
and so they are included.
In later years, this fraction decreases (remember, this is \emph{hypothetical}).
In all data releases, any and all processed images and catalog sources that originate in visits from this Special Program
have the same region and program labels, e.g., ``SP-OSSO".

Step 5. User-Generated Processing. \\
The User-Generated Processing pipeline running the shift-and-stack processing is set up and submitted
for batch processing by the user through the Science Platform or on an external system.
The pipeline's inputs are the processed visit images (and/or difference images) from Prompt processing.
User-generated custom algorithms shift and stack the images, and tasks from the LSST Science
Pipelines are then used to do source detection and characterization and create catalogs.
User-generated custom code derives orbital parameters for the detections, and stores
them in a user-generated catalog with a similar format to {\tt SSObjects}.
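
As a purely illustrative sketch of the shift-and-stack step described above (using plain
{\tt numpy} and {\tt scipy} rather than actual LSST Science Pipelines tasks, with toy data
and a hypothetical grid of trial rates):
\begin{verbatim}
# Illustrative only: not the LSST Science Pipelines API.
import numpy as np
from scipy.ndimage import shift

def shift_and_stack(diff_images, times, rate_x, rate_y):
    """Sum difference images after shifting each back to the epoch of the
    first image, assuming linear sky-plane motion (rates in pixels/day)."""
    t0 = times[0]
    stack = np.zeros_like(diff_images[0])
    for img, t in zip(diff_images, times):
        dt = t - t0
        # a negative shift moves the source back to its position at t0
        stack += shift(img, (-rate_y * dt, -rate_x * dt), order=1)
    return stack

# Toy data: a faint source moving at 1 pixel/day in x, buried in noise.
rng = np.random.default_rng(42)
times = np.array([0.0, 1.0, 2.0, 3.0])          # days
diff_images = []
for t in times:
    im = rng.normal(0.0, 1.0, size=(64, 64))
    im[32, 10 + int(t)] += 3.0                  # per-visit SNR ~ 3
    diff_images.append(im)

# Grid search over hypothetical trial rates (pixels per day).
best = max(
    ((shift_and_stack(diff_images, times, rx, 0.0).max(), rx)
     for rx in np.arange(0.0, 2.01, 0.5)),
    key=lambda p: p[0],
)
print("brightest stack found at trial rate_x =", best[1], "pixels/day")
\end{verbatim}
A real search would cover a two-dimensional grid of rates, run source detection on each
stack, and pass the candidate detections to the orbit-fitting step described above.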


\subsection{Deep Drilling Field}\label{ssec:SPCS_SNDDF}

Step 1. Data acquisition. \\
In the COSMOS DDF, the scheduler obtains 10 standard visits in a row in each of the $griz$ filters
with a small dither pattern between visits.
This happens every other night during a three-month season in each of four years.
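
For scale (illustrative arithmetic only): $10$ visits in each of four filters is
$40 \times 2 \times 15~{\rm s} = 1200~{\rm s}$ of open-shutter time per night, and an
every-other-night cadence over a three-month season gives roughly $45$ nights per season,
or $\sim\!180$ nightly epochs ($\sim\!7200$ standard visits) over the four years.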

Step 2. Regular Prompt processing and Alert Production. \\
Same as above, but the program label would be, e.g., ``SP-DDF-COSMOS".

Step 3. Special Processing with reconfigured pipelines. \\
First, a template image of appropriate depth for ``nightly" difference imaging is created.
At the end of each nightly sequence of observations, a pipeline based on reconfigured
components of the LSST Science Pipelines is automatically triggered.
This pipeline creates nightly coadds in each filter and runs DIA using the template.
Alerts are \emph{not} produced, but unique and separate catalogs with the same format
as {\tt DiaObject} and {\tt DiaSource} are updated within 24 hours (not a requirement).
At the end of each season, deeply coadded images that include all of the DDF's visits
from all years are re-generated, along with a separate {\tt Object}-like catalog.
All images and catalogs are stored in separate butler collections and TAP tables from
the WFD program data products.
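
As an illustrative depth estimate (not a pipeline requirement): coadding the $N = 10$ visits
obtained per filter in a night deepens the $5\sigma$ point-source limit by approximately
\begin{equation}
\Delta m \simeq 1.25 \log_{10} N \approx 1.25~{\rm mag}
\end{equation}
relative to a single visit, assuming background-limited images of comparable quality, so the
nightly difference images probe substantially fainter transients than the per-visit alerts.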

Step 4. Regular processing for inclusion in WFD program data products. \\
Every year, each standard visit is reprocessed by the DIA data release pipelines
and the results are included alongside WFD program data in the relevant DIA data products
(e.g., processed visit images, difference images, associated source catalogs).
Due to their small dither and lack of rotation, not even a single DDF image
is used to supplement the WFD program's all-sky coadd.

Step 5. User-Generated Processing. \\
In order to achieve a secondary science goal of finding very high-$z$ faint supernovae,
a team of users reconfigures the LSST Science Pipelines to create weekly deep coadds
of the COSMOS field and an appropriate-depth template image, and to run DIA at the
end of the season.
These data products are stored in separate catalogs, private to the team, with the same
format and schema as the {\tt DiaSource} and {\tt DiaObject} tables.


\subsection{Short-exposure twilight survey}\label{ssec:SPCS_Twilight}

Twilight observations obtained at, e.g., 60 degrees from the Sun, are particularly
well-suited for finding Near-Earth Objects (NEOs).

Step 1. Data acquisition. \\
At a specified time (or, e.g., at 6-degree twilight), the scheduler begins a dither pattern of
$2$-second exposures.
Coordinates and exposure times are set by the Sun distance, sky brightness, and desired saturation limits.
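
As an illustrative scaling (the actual limits depend on the twilight sky background and seeing):
because the counts collected from a star scale linearly with exposure time, a $2$-second
exposure saturates at roughly
\begin{equation}
\Delta m_{\rm sat} \simeq 2.5 \log_{10}\!\left(\frac{15~{\rm s}}{2~{\rm s}}\right) \approx 2.2~{\rm mag}
\end{equation}
brighter than a single $15$-second exposure, which is what allows brighter stars and
fast-moving objects to be measured unsaturated in these data.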

Step 2. Regular Prompt processing and Alert Production. \\
Whether these data can be processed by the Prompt pipeline, and hence generate alerts,
is pending studies of the DIA and Alert Production pipelines' capabilities to process
short-exposure, high sky-background images (see Section~\ref{ssec:proc_bounds_processing}).

Step 3. RDM special processing with reconfigured pipelines. \\
So long as the short-exposure images can be processed and have enough stars for photometric
and astrometric calibration, reconfigured DM pipelines will probably be sufficient for
creating image and catalog products from this kind of data.
Step 3. Special Processing with reconfigured pipelines. \\
Whether reconfigured pipelines can produce image and catalog products from these data is
similarly pending studies of the LSST Science Pipelines' capabilities to process
short-exposure, high sky-background images (see Section~\ref{ssec:proc_bounds_processing}).

Step 4. Regular processing for inclusion in WFD program data products. \\
These short-exposure, high sky background images would not contribute to the data products created for the WFD program.

Step 5. User-Generated Processing. \\
If short-exposure images cannot be processed with the existing DM algorithms,
user-generated processing would be needed to reduce the raw data and then to
detect and characterize sources in the processed images.


\textbf{Side note:} A short-exposure survey of the bright stars of M67, described in Chapter 10.4 of the
Observing Strategy White Paper \citep{2017arXiv170804058L}, suggests using the stretch goal of
0.1 second exposures or, if that is not possible, \textit{"custom pixel masks to accurately perform
photometry on stars as much as 6 magnitudes brighter than the saturation level"}.
This would need user-generated processing.
