Commit: Merge branch 'develop' into feature/hdf4_subdatasets

Showing 32 changed files with 942 additions and 367 deletions.
@@ -0,0 +1,24 @@ (new file: Windows update script)

```bat
@echo off
call bin\set_local_conda_path.bat
call bin\fix_hardcoded_absolute_paths.bat
call bin\activate_podpac_conda_env.bat

cd podpac
echo "Updating PODPAC"
git fetch
for /f %%a in ('git describe --tags --abbrev^=0 origin/master') do git checkout %%a
cd ..
echo "Updating PODPAC EXAMPLES"
cd podpac-examples
git fetch
for /f %%a in ('git describe --tags --abbrev^=0 origin/master') do git checkout %%a
cd ..
cd podpac
cd dist
echo "Updating CONDA ENVIRONMENT"
conda env update -f windows_conda_environment.yml
cd ..
cd ..
```
```diff
@@ -25,5 +25,5 @@ dependencies:
   - jupyterlab
   - ipyleaflet
   - ipympl
-prefix: D:\podpac-1.3.0\miniconda\envs\podpac
+  - sat-search
```
# Interpolation

## Description

PODPAC allows users to specify various interpolation schemes for nodes with fine-grained control, and even lets users write their own interpolators.

Relevant example notebooks include:
* [Advanced Interpolation](https://github.com/creare-com/podpac-examples/blob/master/notebooks/4-advanced/interpolation.ipynb)
* [Basic Interpolation](https://github.com/creare-com/podpac-examples/blob/master/notebooks/2-combining-data/automatic-interpolation-and-regridding.ipynb)
* [Drought Monitor Data Access Harmonization Processing](https://github.com/creare-com/podpac-examples/blob/master/notebooks/examples/drought-monitor/03-data-access-harmonization-processing.ipynb)

## Examples

Consider a `DataSource` with `lat`, `lon`, `time` coordinates that we will instantiate as:
`node = DataSource(..., interpolation=interpolation)`

`interpolation` can be specified ...

### ...as a string

`interpolation='nearest'`
* **Description**: All dimensions are interpolated using nearest-neighbor interpolation. This is the default; the available options can be found in `podpac.core.interpolation.interpolation.INTERPOLATION_METHODS`.
* **Details**: PODPAC automatically selects appropriate interpolators based on the source coordinates and eval coordinates. Default interpolator orders can be found in `podpac.core.interpolation.interpolation.INTERPOLATION_METHODS_DICT`.

### ...as a dictionary

```python
interpolation = {
    'method': 'nearest',
    'params': {  # Optional. Available parameters depend on the particular interpolator
        'spatial_tolerance': 1.1,
        'time_tolerance': np.timedelta64(1, 'D')
    },
    'interpolators': [ScipyGrid, NearestNeighbor]  # Optional. Available options are in podpac.core.interpolation.interpolation.INTERPOLATORS
}
```
* **Description**: All dimensions are interpolated using nearest-neighbor interpolation, and the interpolators are tried in the order specified. For applicable interpolators, the specified parameters will be used.
* **Details**: PODPAC loops through the `interpolators` list, checking whether each interpolator is able to interpolate between the evaluated and source coordinates. The first capable interpolator is used.
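To make the tolerance parameters concrete, here is a small, self-contained sketch in plain NumPy (a conceptual illustration of the intent of `spatial_tolerance`/`time_tolerance`, not PODPAC's actual implementation; `nearest_with_tolerance` is a hypothetical helper): each requested coordinate takes the closest source value, and requests farther than the tolerance from any source point come back as NaN.

```python
import numpy as np

def nearest_with_tolerance(source_coords, source_values, request_coords, tolerance):
    """Conceptual nearest-neighbor lookup: for each requested coordinate,
    take the closest source value, or NaN if it is farther than `tolerance`."""
    out = np.full(len(request_coords), np.nan)
    for i, c in enumerate(request_coords):
        dist = np.abs(source_coords - c)
        j = np.argmin(dist)
        if dist[j] <= tolerance:
            out[i] = source_values[j]
    return out

src_lat = np.array([0.0, 1.0, 2.0])
src_val = np.array([10.0, 20.0, 30.0])

# 0.9 is within tolerance of lat=1.0; 5.0 is outside tolerance -> NaN
print(nearest_with_tolerance(src_lat, src_val, np.array([0.9, 5.0]), tolerance=1.1))
# -> [20. nan]
```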

### ...as a list

```python
interpolation = [
    {
        'method': 'bilinear',
        'dims': ['lat', 'lon']
    },
    {
        'method': 'nearest',
        'dims': ['time']
    }
]
```

* **Description**: The dimensions listed in each `'dims'` list use the specified method. These dictionaries can also specify the same fields shown in the previous section.
* **Details**: PODPAC loops through the `interpolation` list, applying the settings specified for each dimension independently.
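To illustrate what mixed per-dimension methods do, the following self-contained NumPy sketch (not PODPAC code; `bilinear_2d` and `nearest_index` are hypothetical helpers) selects the nearest `time` slice and then interpolates bilinearly in `lat`/`lon`:

```python
import numpy as np

def bilinear_2d(grid, lat, lon, req_lat, req_lon):
    """Bilinear interpolation on a regular 2-D grid at one point.
    Assumes (req_lat, req_lon) lies strictly inside the grid."""
    i = np.searchsorted(lat, req_lat) - 1
    j = np.searchsorted(lon, req_lon) - 1
    # fractional position within the enclosing cell
    t = (req_lat - lat[i]) / (lat[i + 1] - lat[i])
    u = (req_lon - lon[j]) / (lon[j + 1] - lon[j])
    return ((1 - t) * (1 - u) * grid[i, j] + (1 - t) * u * grid[i, j + 1]
            + t * (1 - u) * grid[i + 1, j] + t * u * grid[i + 1, j + 1])

def nearest_index(coords, req):
    """Index of the source coordinate closest to the request."""
    return int(np.argmin(np.abs(coords - req)))

lat = np.array([0.0, 1.0])
lon = np.array([0.0, 1.0])
time = np.array([0, 10])                 # e.g. days
data = np.zeros((2, 2, 2))               # (lat, lon, time)
data[:, :, 0] = [[0, 1], [2, 3]]         # t=0
data[:, :, 1] = [[10, 11], [12, 13]]     # t=10

k = nearest_index(time, 2)               # t=2 is nearest to t=0
print(bilinear_2d(data[:, :, k], lat, lon, 0.5, 0.5))   # -> 1.5
```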

## Notes and Caveats

While the API is well developed, not all conceivable functionality is. For example, while we can interpolate gridded data to point data, interpolation from point data to gridded data is not as well supported, and there may be errors or unexpected results. Advanced users can develop their own interpolators, but this is not currently well documented.

**Gotcha**: Parameters for a specific interpolator may be silently ignored if a different interpolator is automatically selected.
# Wrapping Datasets

Wrapping a new dataset is challenging because you have to understand all of the quirks of the new dataset and deal with the quirks of PODPAC as well. This reference records a few rules of thumb for wrapping new datasets to help you deal with the latter.

## Rules

1. When evaluating a node with a set of coordinates:
   1. The evaluation coordinates must include ALL of the dimensions present in the source dataset.
   2. The evaluation coordinates MAY contain additional dimensions NOT present in the source dataset, and the source may ignore these.
2. When returning data from a data source node:
   1. The ORDER of the evaluation coordinates MUST be preserved (see `UnitsDataArray.part_transpose`).
   2. Any multi-channel data must be returned using the `output` dimension, which is ALWAYS the LAST dimension.
3. Nodes should be **lightweight** to instantiate, and users should expect *fail on eval*. Cheap checks may be performed on initialization, but anything expensive should be delayed.
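Rule 2.ii can be illustrated in plain NumPy (a conceptual sketch, not PODPAC code): if a reader hands you multi-channel data with the band axis first, move it to the end so the `output` dimension is last.

```python
import numpy as np

# Hypothetical 4-band source evaluated on a 2x3 (lat, lon) grid.
# The reader returns (band, lat, lon); PODPAC wants (lat, lon, output).
raw = np.zeros((4, 2, 3))
fixed = np.moveaxis(raw, 0, -1)   # band axis becomes the last ("output") axis
print(fixed.shape)                # -> (2, 3, 4)
```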

## Guide

In theory, to wrap a new `DataSource`:
1. Create a new class that inherits from `podpac.core.data.DataSource` or a derived class (see the `podpac.core.data` module for generic data readers).
2. Implement a method for opening/accessing the data, or use an existing generic data node and hard-code certain attributes.
3. Implement the `get_coordinates(self)` method.
4. Implement the `get_data(self, coordinates, coordinates_index)` method.
   1. `coordinates` is a `podpac.Coordinates` object in the same coordinate system as the data source (i.e., a subset of what comes out of `get_coordinates`).
   2. `coordinates_index` is a list (or tuple?) of slices, boolean arrays, or index arrays that index into the output of `get_coordinates()` to produce the `coordinates` that come into this function.
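The steps above can be sketched as follows. This is not runnable PODPAC code: `MyDataSource` is a hypothetical class, and the base-class machinery that calls these methods during eval is stubbed out with plain Python. It only shows the shape of the two methods you implement.

```python
import numpy as np

class MyDataSource:
    """Hypothetical sketch of a DataSource wrapper. In real PODPAC this
    would inherit from podpac.core.data.DataSource, which calls
    get_coordinates() / get_data() for you during eval."""

    def __init__(self, source):
        # Lightweight (rule 3): just record where the data lives.
        self.source = source

    def get_coordinates(self):
        # In practice, read these from the file or service header.
        return {'lat': np.linspace(0, 2, 3), 'lon': np.linspace(0, 3, 4)}

    def get_data(self, coordinates, coordinates_index):
        # coordinates_index indexes into the full grid from get_coordinates().
        full = np.arange(12).reshape(3, 4)   # stand-in for the real read
        return full[coordinates_index]

node = MyDataSource('data.h5')               # hypothetical path
subset = node.get_data(None, (slice(0, 2), slice(1, 3)))
print(subset)                                # rows 0-1, cols 1-2
# -> [[1 2]
#     [5 6]]
```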
In practice, the real trick is implementing a compositor to put multiple tiles together so they look like a single `DataSource`. We tend to use the `podpac.compositor.OrderedCompositor` node for this task, but it does not handle interpolation between tiles; for that, see the `podpac.core.compositor.tile_compositor` module.
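The ordered-compositing idea can be sketched in a few lines of NumPy (a conceptual illustration of the behavior, not the actual `OrderedCompositor` implementation): later sources only fill the gaps left by earlier, higher-priority ones.

```python
import numpy as np

def composite_ordered(sources):
    """Conceptual ordered compositing: start from the first source and fill
    any remaining NaN gaps from each subsequent source, in order."""
    out = np.array(sources[0], dtype=float)
    for src in sources[1:]:
        gaps = np.isnan(out)
        out[gaps] = np.asarray(src, dtype=float)[gaps]
    return out

tile_a = [1.0, np.nan, np.nan]   # preferred source, partial coverage
tile_b = [9.0, 2.0, np.nan]      # fallback fills what it can
tile_c = [9.0, 9.0, 3.0]
print(composite_ordered([tile_a, tile_b, tile_c]))   # -> [1. 2. 3.]
```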

When using compositors, it is preferred that the `sources` attribute be populated at instantiation, but on-the-fly (i.e., at eval) population of sources is also acceptable and sometimes necessary for certain data sources.

For examples, check the `podpac.datalib` module.

Happy wrapping!