Note
This technote is not yet published.
The LSST verify framework involves the creation of Measurements throughout the Stack. These Measurements must be delivered to the ap_verify framework so that it can manage them and forward them to SQuaSH. We would like to integrate the verify code into the Stack in a way that minimizes the knowledge ap_verify must have of pipeline implementation details and maximizes the maintainability of the individual Tasks that will be instrumented. This document describes the conventions the ap_verify team has adopted to achieve both goals.
1 Metric definitions
All AP verification metrics shall be stored in the verify_metrics package, following that package's conventions. The metrics, and the conventions for describing them, are documented on Confluence; in particular, all our metrics should be tagged ap_verify for easy filtering and identification.
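As an illustration, a tagged metric definition might look like the following sketch. The file name, metric name, and field values are hypothetical; the authoritative format is whatever verify_metrics and the Confluence conventions specify.

```yaml
# Hypothetical entry in a verify_metrics YAML file (e.g. ap_association.yaml).
numNewDiaObjects:
  description: Number of newly created DIAObjects in this pipeline run.
  unit: ''            # dimensionless count
  tags:
    - ap_verify       # tag required by the AP conventions for filtering
```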
2 Data Flow
Each Task shall be responsible for the persistence of its own Measurements. As described in SQR-019, this may be done by creating a Job object and attaching Measurements and Task-specific metadata, or through output_quantities.
Each Task must write its Measurements to disk in a location that persists at least until the end of the pipeline run, and must store the absolute path to the file in a metadata key named “<standard task prefix>.verify_json_path”. ap_verify shall read any Measurement files mentioned in a Task's metadata, including files associated with subtasks.
3 Instrumenting Tasks
TBD