Public software associated with the challenge


This page describes the software that was made available in the great3-public repository throughout the GREAT3 challenge.

Preparing a submission

The pre-submission script presubmission.py enables participants to put their submissions into the required format. It can be downloaded from https://github.com/barnabytprowe/great3-public, in the folder presubmission_script/ (if you're not familiar with GitHub, see this question on the FAQ for more information on possible ways to download it). Significant updates to this script, such as those required if the simulations are updated, will be announced on our mailing list, so if you're using the script you will probably want to sign up. Minor updates may occur without announcement, so we suggest making sure you are using the latest version before submitting.
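As a rough illustration of what the script produces for a constant-shear branch, the sketch below averages per-galaxy shear estimates within each subfield and writes one row per subfield. The assumed column layout (subfield index, mean g1, mean g2), the number of subfields, and the per-subfield catalog file names are illustrative assumptions only; presubmission.py itself remains the authoritative way to build a valid submission.

```python
# Illustrative sketch only -- presubmission.py is the authoritative tool.
# Assumed (not guaranteed) constant-shear format: one ASCII row per subfield
# containing subfield_index, mean g1, mean g2.
import numpy as np

n_subfields = 200  # assumed number of subfields in a constant-shear branch

rows = []
for subfield in range(n_subfields):
    # "my_shear_catalog_XXX.txt" is a hypothetical per-subfield catalog with
    # columns (id, g1, g2) produced by your own shear estimation code.
    cat = np.loadtxt("my_shear_catalog_%03d.txt" % subfield)
    rows.append((subfield, cat[:, 1].mean(), cat[:, 2].mean()))

# Write one line per subfield: index, mean g1, mean g2.
np.savetxt("my_submission.txt", np.array(rows), fmt="%3d %12.8f %12.8f")
```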

Checking a submission

In the same folder as the pre-submission script there is a submission checker, check_submission.py. This script checks whether a given file constitutes a valid submission that can be uploaded to the GREAT3 server and scored by the scripts that automatically process submissions. The outputs of our pre-submission script should of course pass this test; however, if you wish to prepare your submission in the required format yourself, you may want to use the checker to make sure the format is correct.
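If you do build the submission file yourself, a quick sanity check along the lines below can catch obvious problems before you run check_submission.py. The expected shape assumed here (200 rows of subfield index, g1, g2, as for a constant-shear branch) is illustrative only, and check_submission.py remains the definitive test.

```python
# Rough pre-check; check_submission.py remains the definitive test. The shape
# assumed here -- 200 rows of (subfield_index, g1, g2) -- is illustrative only.
import numpy as np

data = np.loadtxt("my_submission.txt")
assert data.ndim == 2 and data.shape[1] == 3, "unexpected number of columns"
assert data.shape[0] == 200, "unexpected number of subfields"
assert np.array_equal(data[:, 0], np.arange(200)), "subfield indices not 0..199"
assert np.all(np.isfinite(data[:, 1:])), "non-finite shear values"
assert np.all(np.abs(data[:, 1:]) < 1.0), "implausibly large shear values"
print("Basic checks passed; now run check_submission.py for the real test.")
```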

Basic handling of multiepoch data

In a different folder in the repository, named example_scripts/, the script coadd_multiepoch.py provides a simple coaddition code for multiepoch imaging data, adapted to work with the GREAT3 data format. It is intended to provide a "first-order" image combination module that allows GREAT3 participants to compete in branches with multiepoch data without having to write their own equivalent code, and to serve as a pedagogical example for those who wish to create their own image combination module. It is not expected to be competitive with the current state of the art in astronomical image combination, but it should not be orders of magnitude worse, either.
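To give a sense of what "first-order" image combination means here, the fragment below simply takes the equal-weight mean of the epoch images of one subfield, ignoring any sub-pixel offsets between epochs that a real coaddition code would need to handle. The file names and the number of epochs are illustrative assumptions; coadd_multiepoch.py does the corresponding job adapted to the actual GREAT3 file layout.

```python
# Minimal sketch of first-order image combination: an equal-weight mean of the
# epoch images. File names and epoch count are illustrative assumptions; see
# coadd_multiepoch.py for the code adapted to the real GREAT3 file layout.
import numpy as np
from astropy.io import fits

n_epochs = 6  # assumed number of epochs per multiepoch subfield

epochs = []
for epoch in range(n_epochs):
    with fits.open("image-000-%d.fits" % epoch) as hdus:  # hypothetical names
        epochs.append(hdus[0].data.astype(np.float64))

coadd = np.mean(epochs, axis=0)
fits.writeto("coadd-000.fits", coadd, overwrite=True)
```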

Basic PSF modelling for variable_psf data

Also in example_scripts/, the script psf_models.py provides a simple, PCA-based PSF modelling code that can be used to build a model of the PSF and to generate images of the PSF from that model at the positions of the GREAT3 galaxy images. Like coadd_multiepoch.py, it is intended to provide a "first-order" PSF module that allows GREAT3 participants to compete in branches with variable PSFs without writing their own PSF estimation module, and to provide an example for those who do wish to create their own PSF estimation module. It is not expected to be competitive with the current state of the art in PSF estimation, but it should not be orders of magnitude worse, either.
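The basic idea behind a PCA-based PSF model is sketched below: star images sampled across the field are decomposed into principal components, the per-star coefficients are fitted as smooth functions of field position (a simple linear fit here), and the PSF is then reconstructed at any requested position. The array names, the number of components, and the linear spatial fit are illustrative assumptions; psf_models.py implements the version actually adapted to the GREAT3 data.

```python
# Sketch of PCA-based PSF modelling. Assumed inputs: star_images, an array of
# shape (n_star, npix, npix) of star postage stamps, and star_xy, an array of
# shape (n_star, 2) of their field positions. The linear spatial fit is an
# illustrative choice; psf_models.py implements the real thing.
import numpy as np

def build_psf_model(star_images, star_xy, n_components=4):
    """Fit a PCA + linear-in-position model to a set of star postage stamps."""
    n_star, ny, nx = star_images.shape
    vectors = star_images.reshape(n_star, ny * nx)
    mean_psf = vectors.mean(axis=0)
    # Principal components of the star images about their mean.
    u, s, vt = np.linalg.svd(vectors - mean_psf, full_matrices=False)
    components = vt[:n_components]                 # (n_components, ny*nx)
    coeffs = (vectors - mean_psf) @ components.T   # per-star PC coefficients
    # Fit each coefficient as a linear function a + b*x + c*y of position.
    design = np.column_stack([np.ones(n_star), star_xy[:, 0], star_xy[:, 1]])
    fit, *_ = np.linalg.lstsq(design, coeffs, rcond=None)
    return mean_psf, components, fit, (ny, nx)

def psf_at_position(model, x, y):
    """Reconstruct a PSF image at field position (x, y) from the model."""
    mean_psf, components, fit, shape = model
    coeffs = np.array([1.0, x, y]) @ fit           # interpolate coefficients
    return (mean_psf + coeffs @ components).reshape(shape)
```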

Basic shear estimation for control, real_galaxy, multiepoch, and variable_psf data

The final two scripts in example_scripts/ are a pair that carries out basic shear estimation, using a moments-based method in GalSim, for any of these four experiments. The script simple_shear.py can be used to estimate shears for all galaxies in a particular subfield, outputting a catalog of shear estimates, and the script mass_produce_shear.py can drive simple_shear.py to estimate shears for all subfields in a branch. The outputs of these scripts can be fed directly into the pre-submission script. For multiepoch (variable_psf) branches, the shear estimation scripts require the outputs of the coaddition script (the PSF modelling script). For more information, please see the docstrings of these scripts; you can also see their performance on the GREAT3 leaderboard, under the team name GREAT3_EC.
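For orientation, the fragment below shows the kind of GalSim call these scripts are built around: galsim.hsm.EstimateShear takes a galaxy postage stamp plus a PSF image and returns PSF-corrected shape estimates. The file names, the assumed postage-stamp layout, the use of the default REGAUSS correction, and the crude responsivity factor applied when averaging are all illustrative assumptions; simple_shear.py and mass_produce_shear.py handle the real GREAT3 catalogues and outputs.

```python
# Sketch of moments-based shear estimation with GalSim's HSM module.
# File names, stamp layout, and the responsivity treatment are illustrative
# assumptions; see simple_shear.py for the code adapted to GREAT3 data.
import galsim

psf_image = galsim.fits.read("starfield_image-000-0.fits")  # hypothetical name
gal_image = galsim.fits.read("image-000-0.fits")            # hypothetical name

stamp_size = 48  # assumed postage-stamp size in pixels
nx = gal_image.array.shape[1] // stamp_size
ny = gal_image.array.shape[0] // stamp_size
psf_stamp = psf_image[galsim.BoundsI(1, stamp_size, 1, stamp_size)]

e1_list, e2_list = [], []
for iy in range(ny):
    for ix in range(nx):
        bounds = galsim.BoundsI(ix * stamp_size + 1, (ix + 1) * stamp_size,
                                iy * stamp_size + 1, (iy + 1) * stamp_size)
        result = galsim.hsm.EstimateShear(gal_image[bounds], psf_stamp,
                                          strict=False)
        if result.error_message == "":  # keep only successful measurements
            e1_list.append(result.corrected_e1)
            e2_list.append(result.corrected_e2)

# The default REGAUSS correction returns distortions (e1, e2); converting the
# mean distortion to a shear estimate requires a responsivity factor, taken
# here as roughly 2 * (1 - e_rms**2) ~ 1.7, a crude illustrative assumption.
responsivity = 1.7
g1 = sum(e1_list) / len(e1_list) / responsivity
g2 = sum(e2_list) / len(e2_list) / responsivity
print("Estimated mean shear: g1 = %.5f, g2 = %.5f" % (g1, g2))
```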