# terraref/computing-pipeline


# Convert hyperspectral exposure image to reflectance #88

opened this Issue Apr 14, 2016 · 50 comments
### czender commented Apr 14, 2016 • edited by dlebauer

This is a draft algorithm to retrieve spectral reflectance based on my understanding of the (possibly soon-to-be) available inputs. Suggestions and corrections are welcome (the more specific, the better). For simplicity, the algorithm description currently omits the time dimension. It is implicit in all quantities below except rfl_wht.

NOTE: the proposal below has been migrated to documentation that supports LaTeX, for ease of reading: https://terraref.gitbooks.io/terraref-documentation/content/hyperspectral_data.html However, commenting there is limited.

So please comment below on this issue, or propose changes to the algorithm text as a pull request: https://github.com/terraref/documentation/blob/master/hyperspectral_data.md

## Inputs and Outputs

syntax: Variable(dimensions) [units]

### czender commented Nov 3, 2016

The above script reduces the ~10 GB raw image files to a series of area-averaged white and dark reference data that we actually use in the HS workflow. xps_tm is the exposure time in ms. The minimal files to retain are these 17 KB files: vnir_wht_avg_${xps_tm}.nc and vnir_drk_avg_${xps_tm}.nc. The img files are essentially the same ~10 GB as the raw files. The cut files are each ~50 MB and contain only the portion of the image that contains the target; these files are averaged to produce the avg values used in the calibration. Keep as much of this as @dlebauer and @solmazhajmohammadi would like for provenance reasons.
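The area-averaging step described above can be sketched as follows. This is a hypothetical illustration, not czender's actual NCO script: it collapses a cut-out reference region (white or dark) to one mean value per band, which is why only the small averaged files need to be retained.

```python
# Hypothetical sketch of area-averaging a cut reference region:
# reduce each spectral band of the region to a single mean value,
# yielding the small 'avg' spectrum used in calibration.

def area_average(cut_image):
    """Average a cut reference image over its spatial dimensions.

    cut_image: list of bands, each band a 2-D list (rows x cols) of
    digital numbers from the white (or dark) reference region.
    Returns one mean value per band.
    """
    avg = []
    for band in cut_image:
        pixels = [v for row in band for v in row]
        avg.append(sum(pixels) / len(pixels))
    return avg

# Toy example: 2 bands, each a 2x2 cut region.
wht_cut = [[[100, 102], [98, 100]],
           [[200, 204], [196, 200]]]
print(area_average(wht_cut))  # [100.0, 200.0]
```

The same routine would be applied to the dark-reference cut files, one pair of averaged spectra per exposure time.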

### craig-willis commented Nov 3, 2016

Thanks for the script details. I'm just now reading through this full thread and have a few questions/clarifications:

- It sounds like the dark/white reference data will need to be collected periodically. If so, @czender, will your script need to be re-run?
- @solmazhajmohammadi You mention that the data is on the cache server, but it's also uploaded to Google Drive. If this data will be collected more than once, should we define a location to store it long term (e.g., /sites/ua-mac/calibration-data/)?
- @czender It seems that the area-averaged values are ultimately what you need. At this point, where are you storing (or planning to store) the output files?

### solmazhajmohammadi commented Nov 3, 2016

@craig-willis Yes, sure, it makes sense to collect them separately for now.

### czender commented Nov 3, 2016

 I'm planning to store the area-averaged files in the hyperspectral workflow script directory, i.e., the same directory as hyperspectral_workflow.sh, so they can easily be found and modified until they are stable. It would be good to have a location not in the scripts directory for the other files.

### czender commented Nov 14, 2016

@yanliu-chn Please update the NCO build/module on ROGER to 4.6.2-beta03, which contains some new features helpful to the hyperspectral calibration. Thank you.

### yanliu-chn commented Nov 14, 2016

Done. Set default to 4.6.2-beta03. To use:

    $ module purge
    $ module load gdal-stack-2.7.10 nco
    $ echo $NCO_HOME
    /sw/nco-4.6.2-beta03

### czender commented Nov 14, 2016

@max-zilla or @yanliu-chn @craig-willis The hyperspectral workflow now requires that the eight VNIR calibration *.nc files just added to the HS scripts directory reside in the same directory as hyperspectral_workflow.sh. Since they are all in the same git directory, this should be automatic, but if extractors ever separate them, some paths will need to be modified. The HS workflow now requires NCO 4.6.2-beta03 or later. Maybe don't pull the new stuff until/unless you're sure these requirements are met.
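The path convention above (calibration files alongside the workflow script) can be sketched like this. The file name is a hypothetical example; the actual files are the vnir_wht_avg_${xps_tm}.nc / vnir_drk_avg_${xps_tm}.nc set mentioned earlier in the thread.

```python
# Hypothetical sketch: resolve a calibration file relative to the
# directory containing the workflow script, so the lookup works no
# matter where the workflow is invoked from.
import os

def calibration_path(script_path, fname):
    """Return the expected location of a calibration file, assuming it
    resides alongside the workflow script (e.g. hyperspectral_workflow.sh)."""
    return os.path.join(os.path.dirname(os.path.abspath(script_path)), fname)

# "vnir_wht_avg_25.nc" is a made-up example name (exposure time 25 ms).
print(calibration_path("/scripts/hyperspectral_workflow.sh",
                       "vnir_wht_avg_25.nc"))  # /scripts/vnir_wht_avg_25.nc
```

If extractors ever relocate the calibration files, only this base-directory resolution would need to change.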

### solmazhajmohammadi commented Dec 2, 2016

Please find the white Spectralon measurement using the SWIR camera here: https://drive.google.com/drive/folders/0B9h5V5JdLLXmRHZJN0d1VXJLNVE?usp=sharing Each folder contains one line of dark reference measurement. The raw file is the scan of the Spectralon target plus 8 calibrated color targets on top of the Spectralon target; please choose your region of interest only from the white target. Please note that you should apply the same procedure for the SWIR camera as well (subtract the dark measurement from the data and from the white reference).
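The dark-subtraction procedure described above can be sketched as a per-band reflectance ratio. This is a minimal illustration of the standard flat-field form, not the workflow's exact implementation; names and numbers are made up.

```python
# Hypothetical sketch of the calibration step: subtract the dark
# reference from both the raw exposure and the white reference, then
# take their ratio to obtain reflectance per band.

def reflectance(raw, wht, drk):
    """Per-band reflectance: rfl = (raw - drk) / (wht - drk).

    raw, wht, drk are lists of per-band values taken at the same
    exposure time (raw scene, white Spectralon, dark reference).
    """
    return [(r - d) / (w - d) for r, w, d in zip(raw, wht, drk)]

# Toy numbers: a pixel at half the dark-corrected white signal
# comes out at reflectance 0.5.
print(reflectance(raw=[60.0], wht=[110.0], drk=[10.0]))  # [0.5]
```

The same formula applies to both VNIR and SWIR, provided the references were acquired at the matching exposure time.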

### czender commented Dec 2, 2016 • edited by rachelshekar

 I'll defer this until the camera is repaired and someone affirms that the SWIR spectralon measurements apply to the repaired camera. - terraref/reference-data#50

### max-zilla commented Dec 14, 2016

 Just updated the extractor for NCO 4.6.2-beta03 FYI.

### yanliu-chn commented Dec 14, 2016 • edited by dlebauer

Saw the issue, thanks! This is consistent with the ROGER NCO deployment. Charlie has now pushed a few new releases; let's update when the next official version comes out. Thanks! -Yan

### dlebauer commented Jan 5, 2017 • edited

Closing per suggestion from @czender. Other issues, e.g. #208 (uncertainty, reflectance, etc.), cover outstanding calibration issues.

### dlebauer referenced this issue Feb 22, 2017

#### Collect diurnal hyperspectral images for calibration #263 (Closed, 0 of 1 task complete)

### czender referenced this issue Mar 2, 2017

#### Develop new calibration procedure for Hyperspectral Imagers #18 (Closed, 0 of 15 tasks complete)

### dlebauer referenced this issue Mar 8, 2017

#### Develop new calibration procedure for Hyperspectral Imagers #281 (Open, 4 of 17 tasks complete)

### czender referenced this issue Mar 9, 2017

#### New hyperspectral calibration algorithm documentation #282 (Open, 1 of 3 tasks complete)

### solmazhajmohammadi referenced this issue Jun 22, 2017

#### Dark Reference Measurement for SWIR Sensor #151 (Open)