MAINT: Updating changelog, fixing documentation, updating conda environment files, bumping version, and adding a file to update podpac from a zip-file-folder install.
mpu-creare committed Jul 14, 2020
1 parent 9c06de1 commit 774d7a3
Showing 8 changed files with 208 additions and 117 deletions.
33 changes: 33 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,38 @@
# Changelog

## 2.2.0
### Introduction

Wrapping Landsat8, Sentinel2, and MODIS data and improving interpolation.

### Features
* Added `datalib.satutils` which wraps Landsat8 and Sentinel2 data
* Added `datalib.modis_pds` which wraps MODIS products ["MCD43A4.006", "MOD09GA.006", "MYD09GA.006", "MOD09GQ.006", "MYD09GQ.006"]
* Added `settings['AWS_REQUESTER_PAYS']` and the `authentication.S3Mixin.aws_requester_pays` attribute to support Sentinel2 data
* Added an `issubset` method to `Coordinates` which allows users to test whether one set of coordinates is a subset of another (see the sketch after this list)
* Added environment variables to the Lambda function deployment allowing users to specify the location of additional dependencies (`FUNCTION_DEPENDENCIES_KEY`) and settings (`SETTINGS`). This was done in support of the WMS service.
* Intake nodes can now filter inputs by additional data columns for .csv files / pandas DataFrames using the pandas `query` method.
* Added documentation on `Interpolation` and `Wrapping Datasets`
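A minimal sketch of the new `issubset` check; the coordinate values and `clinspace` grids below are illustrative, not taken from the release itself:

```python
import podpac

# Two hypothetical grids over the same bounding box: every value in the
# coarse 1-degree grid also appears in the fine 0.5-degree grid.
coarse = podpac.Coordinates(
    [podpac.clinspace(45, 50, 6), podpac.clinspace(-75, -70, 6)], dims=["lat", "lon"]
)
fine = podpac.Coordinates(
    [podpac.clinspace(45, 50, 11), podpac.clinspace(-75, -70, 11)], dims=["lat", "lon"]
)

# New in 2.2.0: test whether one set of coordinates is a subset of another
print(coarse.issubset(fine))  # expected True for these grids
print(fine.issubset(coarse))  # expected False
```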

### Bug Fixes
* Added a `dims` attribute to `Compositor` nodes which indicates the dimensions that sources are expected to have. This
fixes a bug where `Nodes` threw an error if the `Coordinates` contained extra dimensions that the `Compositor` sources were missing.
* `COSMOSStations` will no longer fail for sites with no data or one data point. These sites are now automatically filtered.
* Fixed `core.data.file_source` closing files prematurely due to using context managers
* Fixed heterogeneous interpolation (where lat/lon uses a different interpolator than time, for example)
* `datalib.TerrainTiles` now accesses S3 anonymously by default. Interpolation settings specified at the compositor level are now also passed down to the sources.

### Breaking changes
* Fixed `core.algorithm.signal.py`, removing `SpatialConvolution` and `TemporalConvolutions` in the process. Users now
have to label the dimensions of the kernel, which prevents results from changing if the eval coordinates are
transposed. This was a major bug in the `Convolution` node, and the change obviates the need for the removed Nodes,
but it may break some pipelines (see the sketch below).
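For illustration only, a hedged sketch of what the labeled-kernel usage might look like; the `Array` node keywords, the `kernel_dims` attribute, and the exact `Convolution` signature are assumptions based on the description above, not confirmed API:

```python
import numpy as np
import podpac

# Hypothetical in-memory source node to smooth
coords = podpac.Coordinates(
    [podpac.clinspace(45, 50, 50), podpac.clinspace(-75, -70, 50)], dims=["lat", "lon"]
)
source = podpac.data.Array(source=np.random.rand(50, 50), coordinates=coords)

# The kernel dimensions are labeled explicitly, so transposing the eval
# coordinates cannot silently change the result (attribute names assumed).
smooth = podpac.algorithm.Convolution(
    source=source,
    kernel=np.ones((3, 3)) / 9.0,
    kernel_dims=["lat", "lon"],
)
output = smooth.eval(coords)
```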


## 2.1.0
### Introduction

24 changes: 24 additions & 0 deletions dist/local_Windows_install/update_podpac.bat
@@ -0,0 +1,24 @@
@echo off
call bin\set_local_conda_path.bat
call bin\fix_hardcoded_absolute_paths.bat
call bin\activate_podpac_conda_env.bat

cd podpac
echo "Updating PODPAC"
git fetch
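REM The for /f loop captures the latest tag reachable from origin/master and checks it out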
for /f %%a in ('git describe --tags --abbrev^=0 origin/master') do git checkout %%a
cd ..
echo "Updating PODPAC EXAMPLES"
cd podpac-examples
git fetch
for /f %%a in ('git describe --tags --abbrev^=0 origin/master') do git checkout %%a
cd ..
cd podpac
cd dist
echo "Updating CONDA ENVIRONMENT"
conda env update -f windows_conda_environment.yml
cd ..
cd ..



244 changes: 139 additions & 105 deletions dist/windows_conda_environment.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion dist/windows_conda_environment.yml
@@ -25,5 +25,5 @@ dependencies:
- jupyterlab
- ipyleaflet
- ipympl
prefix: D:\podpac-1.3.0\miniconda\envs\podpac
- sat-search

2 changes: 1 addition & 1 deletion doc/source/design.rst
@@ -17,7 +17,7 @@ Node
**Nodes** describe the components of your analysis.
These include data sources, combined data sources (**Compositors**), algorithms, and the assembly of data sources.
Nodes are assembled into :ref:`design_pipelines`, which can be output to a text file or pushed to the cloud
with minimal configuration. **Nodes** are design to **FAIL ON EVAL**, not fail when instantiated. This is order to defer
with minimal configuration. **Nodes** are designed to **FAIL ON EVAL**, not fail when instantiated. This is in order to defer
expensive operations till the user really wants them.

.. image:: /_static/img/node.png
6 changes: 3 additions & 3 deletions doc/source/interpolation.md
@@ -6,9 +6,9 @@ PODPAC allows users to specify various different interpolation schemes for nodes
increased granularity, and even lets users write their own interpolators.

Relevant example notebooks include:
* https://github.com/creare-com/podpac-examples/blob/master/notebooks/4-advanced/interpolation.ipynb
* https://github.com/creare-com/podpac-examples/blob/master/notebooks/2-combining-data/automatic-interpolation-and-regridding.ipynb
* https://github.com/creare-com/podpac-examples/blob/master/notebooks/examples/drought-monitor/03-data-access-harmonization-processing.ipynb
* [Advanced Interpolation](https://github.com/creare-com/podpac-examples/blob/master/notebooks/4-advanced/interpolation.ipynb)
* [Basic Interpolation](https://github.com/creare-com/podpac-examples/blob/master/notebooks/2-combining-data/automatic-interpolation-and-regridding.ipynb)
* [Drought Monitor Data Access Harmonization Processing](https://github.com/creare-com/podpac-examples/blob/master/notebooks/examples/drought-monitor/03-data-access-harmonization-processing.ipynb)

## Examples
Consider a `DataSource` with `lat`, `lon`, `time` coordinates that we will instantiate as:
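(The instantiation code is truncated in this view; the following is a minimal sketch assuming podpac's in-memory `Array` node and `clinspace` helper, not the exact snippet from the doc.)

```python
import numpy as np
import podpac

# Hypothetical lat, lon, time coordinates for the example DataSource
coords = podpac.Coordinates(
    [
        podpac.clinspace(45, 50, 6),
        podpac.clinspace(-75, -70, 6),
        ["2020-01-01", "2020-01-02"],
    ],
    dims=["lat", "lon", "time"],
)

# In-memory data source; `interpolation` selects the scheme (e.g. "nearest")
node = podpac.data.Array(
    source=np.random.rand(6, 6, 2),
    coordinates=coords,
    interpolation="nearest",
)
output = node.eval(coords)
```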
12 changes: 6 additions & 6 deletions doc/source/wrapping-datasets.md
@@ -4,11 +4,11 @@ Wrapping a new dataset is challenging because you have to understand all of the

## Rules
1. When evaluating a node with a set of coordinates:
a. The evaluation coordinates must include ALL of the dimensions present in the source dataset
b. The evaluation coordinates MAY contain additional dimensions NOT present in the source dataset, and the source may ignore these
1. The evaluation coordinates must include ALL of the dimensions present in the source dataset
1. The evaluation coordinates MAY contain additional dimensions NOT present in the source dataset, and the source may ignore these
2. When returning data from a data source node:
a. The ORDER of the evaluation coordinates MUST be preserved (see `UnitsDataArray.part_transpose`)
b. Any multi-channel data must be returned using the `output` dimension which is ALWAYS the LAST dimension
1. The ORDER of the evaluation coordinates MUST be preserved (see `UnitsDataArray.part_transpose`)
1. Any multi-channel data must be returned using the `output` dimension which is ALWAYS the LAST dimension
3. Nodes should be **lightweight** to instantiate and users should expect *fail on eval*. Easy checks should be performed on initialization, but anything expensive should be delayed.

## Guide
@@ -17,8 +17,8 @@ In theory, to wrap a new `DataSource`:
2. Implement a method for opening/accessing the data, or use an existing generic data node and hard-code certain attributes
3. Implement the `get_coordinates(self)` method
4. Implement the `get_data(self, coordinates, coordinates_index)` method
a. `coordinates` is a `podpac.Coordinates` object and it's in the same coordinate system as the data source (i.e. a subset of what comes out of `get_coordinates`)
b. `coordinates_index` is a list (or tuple?) of slices or boolean arrays or index arrays to indexes into the output of `get_coordinates()` to produce `coordinates` that come into this function.
1. `coordinates` is a `podpac.Coordinates` object and it's in the same coordinate system as the data source (i.e. a subset of what comes out of `get_coordinates`)
2. `coordinates_index` is a list (or tuple?) of slices, boolean arrays, or index arrays that index into the output of `get_coordinates()` to produce the `coordinates` passed into this function.
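A minimal sketch of the `get_coordinates`/`get_data` pattern described above, using a hypothetical `MyDataSource`; the random data and the `create_output_array` helper are illustrative assumptions, not a prescribed implementation:

```python
import numpy as np
import podpac

class MyDataSource(podpac.data.DataSource):
    """Hypothetical data source used only to illustrate the required methods."""

    def get_coordinates(self):
        # Full native coordinates of the (hypothetical) dataset
        return podpac.Coordinates(
            [podpac.clinspace(45, 50, 101), podpac.clinspace(-75, -70, 101)],
            dims=["lat", "lon"],
        )

    def get_data(self, coordinates, coordinates_index):
        # `coordinates` is the requested subset of get_coordinates();
        # `coordinates_index` indexes into get_coordinates() to produce it.
        data = np.random.rand(*coordinates.shape)  # stand-in for a real file or service read
        return self.create_output_array(coordinates, data=data)
```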

In practice, the real trick is implementing a compositor to put multiple tiles together to look like a single `DataSource`. We tend to use the `podpac.compositor.OrderedCompositor` node for this task, but it does not handle interpolation between tiles. Instead, see the `podpac.core.compositor.tile_compositor` module.

2 changes: 1 addition & 1 deletion podpac/version.py
@@ -17,7 +17,7 @@
##############
MAJOR = 2
MINOR = 2
HOTFIX = 0
HOTFIX = 1
##############


