Commit
Various fixes for documentation (#4250)
philippjfr committed Mar 3, 2020
1 parent 36a53e4 commit 81108e9
Showing 5 changed files with 75 additions and 78 deletions.
3 changes: 1 addition & 2 deletions examples/user_guide/03-Applying_Customizations.ipynb
@@ -724,8 +724,7 @@
"\n",
"##### ``norm`` options:\n",
"\n",
"``norm`` options are a special type of plot option that are applied orthogonally to the above two types, to control normalization. Normalization refers to adjusting the properties of one plot relative to those of another. For instance, two images normalized together would appear with relative brightness levels, with the brightest image using the full range black to white, while the other image is scaled proportionally. Two images normalized independently would both cover the full range from black to white. Similarly, two axis ranges normalized together will expand to fit the largest range of either axis, while those normalized separately would cover different ranges. For listing available options, see the output of ``hv.help``.\n",
"\n",
"``norm`` options are a special type of plot option that are applied orthogonally to the above two types, to control normalization. Normalization refers to adjusting the properties of one plot relative to those of another. For instance, two images normalized together would appear with relative brightness levels, with the brightest image using the full range black to white, while the other image is scaled proportionally. Two images normalized independently would both cover the full range from black to white. Similarly, two axis ranges normalized together are effectively linked and will expand to fit the largest range of either axis, while those normalized separately would cover different ranges. For listing available options, see the output of ``hv.help``.\n",
"\n",
"You can preserve the semantic distinction between these types of option in an augmented form of the [Literal syntax](#Literal-syntax) as follows:"
]
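The ``norm`` options paragraph above distinguishes normalizing plots together from normalizing them independently. The distinction can be sketched with plain NumPy (hypothetical data and helper names, not the HoloViews API itself):

```python
import numpy as np

# Two tiny "images" with different value ranges.
img_a = np.array([[0.0, 2.0], [4.0, 8.0]])
img_b = np.array([[0.0, 1.0], [1.5, 2.0]])

def normalize_independently(*imgs):
    # Each image is scaled to its own [0, 1] range,
    # so both end up spanning full black to white.
    return [(im - im.min()) / (im.max() - im.min()) for im in imgs]

def normalize_together(*imgs):
    # All images share one min/max, so relative brightness
    # between the images is preserved.
    lo = min(im.min() for im in imgs)
    hi = max(im.max() for im in imgs)
    return [(im - lo) / (hi - lo) for im in imgs]

indep = normalize_independently(img_a, img_b)
joint = normalize_together(img_a, img_b)
```

Normalized together, the dimmer image only reaches a fraction of the full range; normalized independently, both reach 1.0.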
25 changes: 22 additions & 3 deletions examples/user_guide/05-Dimensioned_Containers.ipynb
@@ -22,8 +22,11 @@
"source": [
"import numpy as np\n",
"import holoviews as hv\n",
"\n",
"from holoviews import opts\n",
"hv.notebook_extension('bokeh')\n",
"\n",
"hv.extension('bokeh')\n",
"\n",
"opts.defaults(opts.Curve(line_width=1))"
]
},
@@ -247,7 +250,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Since this simply represents a multi-dimensional parameter space we can collapse this datastructure into a single table containing columns for each of the dimensions we have declared:"
"Since this simply represents a multi-dimensional parameter space we can collapse this datastructure into a single `Dataset` containing columns for each of the dimensions we have declared:"
]
},
{
@@ -256,10 +259,17 @@
"metadata": {},
"outputs": [],
"source": [
"ds = hv.Dataset(grid.table())\n",
"ds = grid.collapse()\n",
"ds.data.head()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A ``Dataset`` is an element type which does not itself have a visual representation and may contain any type of data supported by HoloViews interfaces."
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -330,6 +340,15 @@
"source": [
"As you can see the 'New dimension' was added at position zero."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Onwards\n",
"\n",
"As we have seen Dimensioned containers make it easy to explore multi-dimensional datasets by mapping the dimensions onto visual layers (`NdOverlay`), 2D grid axes (`GridSpace`), widgets (`HoloMap`) or an arbitrary layout (`NdLayout`). In the [next section](06-Dimensioned_Containers.ipynb) we will discover how to construct composite objects of heterogeneous data."
]
}
],
"metadata": {
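The notebook change above replaces `grid.table()` with `grid.collapse()`, which flattens a keyed grid of elements into one columnar dataset. The idea can be sketched with plain Python and NumPy (hypothetical data, not the HoloViews container types themselves):

```python
import numpy as np

xs = np.linspace(0, np.pi, 3)

# A tiny "GridSpace"-like mapping: key-dimension values -> sampled curves.
grid = {
    ('sine', 1): np.sin(xs),
    ('sine', 2): np.sin(2 * xs),
}

# Collapsing concatenates everything into one table, repeating the
# key-dimension values (label, frequency) alongside each (x, y) sample.
rows = [
    (label, freq, x, y)
    for (label, freq), ys in grid.items()
    for x, y in zip(xs, ys)
]
```

Each element contributes one row per sample, so a 2-element grid of 3-point curves yields a 6-row table with columns for every declared dimension.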
25 changes: 4 additions & 21 deletions examples/user_guide/06-Building_Composite_Objects.ipynb
@@ -86,22 +86,6 @@
"layout"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The structure of this object can be seen using ``print()``:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(layout)"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -302,7 +286,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"At this point, we have reached the end of the HoloViews objects; below this object is only the raw data as a Numpy array:"
"At this point, we have reached the end of the HoloViews objects; below this object is only the raw data stored in one of the supported formats (e.g. NumPy arrays or Pandas DataFrames):"
]
},
{
@@ -398,8 +382,7 @@
"metadata": {},
"outputs": [],
"source": [
"layout.Parameters.Sines.select(Amplitude=0.5,Power=1.0, \n",
" Frequency=1.0).Phases.Sines.select(Phase=0.0)"
"layout.Parameters.Sines.select(Amplitude=0.5,Power=1.0, Frequency=1.0, Phase=0.0).Phases.Sines"
]
},
{
@@ -415,9 +398,9 @@
"source": [
"As you can see, HoloViews lets you compose objects of heterogenous types, and objects covering many different numerical or other dimensions, laying them out spatially, temporally, or overlaid. The resulting data structures are complex, but they are composed of simple elements with well-defined interactions, making it feasible to express nearly any relationship that will characterize your data. In practice, you will probably not need this many levels, but given this complete example, you should be able to construct an appropriate hierarchy for whatever type of data that you want to represent or visualize. \n",
"\n",
"As emphasized above, it is not possible to combine these objects in other orderings. Of course, any ``Element`` can be substituted for any other, which doesn't change the structure. But you cannot e.g. display an ``Overlay`` or ``HoloMap`` of ``Layout`` objects. Confusingly, the objects may *act* as if you have these arrangements. For instance, a ``Layout`` of ``HoloMap`` objects will be animated, like ``HoloMap`` objects, but only because of the extra dimension(s) provided by the enclosed ``HoloMap`` objects, not because the ``Layout`` itself has data along those dimensions. Similarly, you cannot have a ``Layout`` of ``Layout`` objects, even though it looks like you can. E.g. the ``+`` operator on two ``Layout`` objects will not create a ``Layout`` of ``Layout`` objects; it just creates a new ``Layout`` object containing the data from both of the other objects. Similarly for the ``Overlay`` of ``Overlay`` objects using ``*``; only a single combined ``Overlay`` is returned.\n",
"As emphasized above, it is not recommended to combine these objects in other orderings. Of course, any ``Element`` can be substituted for any other, which doesn't change the structure. But you should not e.g. display an ``Overlay`` of ``Layout`` objects. The display system will generally attempt to figure out the correct arrangement and warn you to call the `.collate` method to reorganize the objects in the recommended format.\n",
"\n",
"If you are confused about how all of this works in practice, you can use the examples in the [Dimensioned Containers](./05-Dimensioned_Containers.ipynb) user guide."
"Another important thing to observe is that ``Layout`` and ``Overlay`` types may not be nested, e.g. using the ``+`` operator on two `Layout` objects will not create a nested `Layout` instead combining the contents of the two objects. The same applies to the ``*`` operator and combining of `Overlay` objects. "
]
}
],
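The rewritten cell above mentions that the display system may ask you to call `.collate` to reorganize wrongly nested containers. What collation does can be sketched as transposing nested mappings with plain Python dicts (hypothetical keys and values, not the HoloViews types):

```python
# A "Layout of HoloMaps" as nested dicts:
# outer level = layout items, inner level = animation frames.
layout_of_maps = {
    'curve': {0: 'curve-frame-0', 1: 'curve-frame-1'},
    'image': {0: 'image-frame-0', 1: 'image-frame-1'},
}

frames = {k for inner in layout_of_maps.values() for k in inner}

# "Collating" transposes the nesting so the frame dimension becomes the
# outer level, letting one set of widgets drive the whole layout.
map_of_layouts = {
    frame: {name: inner[frame] for name, inner in layout_of_maps.items()}
    for frame in frames
}
```

After the transpose, selecting one frame yields a complete layout, which is the arrangement the plotting code can render directly.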
48 changes: 48 additions & 0 deletions holoviews/core/ndmapping.py
@@ -853,6 +853,54 @@ def clone(self, data=None, shared_data=True, new_type=None, link=True,
if k not in pos_args})


def collapse(self, dimensions=None, function=None, spreadfn=None, **kwargs):
"""Concatenates and aggregates along supplied dimensions
Useful to collapse stacks of objects into a single object,
e.g. to average a stack of Images or Curves.
Args:
dimensions: Dimension(s) to collapse
Defaults to all key dimensions
function: Aggregation function to apply, e.g. numpy.mean
spreadfn: Secondary reduction to compute value spread
Useful for computing a confidence interval, spread, or
standard deviation.
**kwargs: Keyword arguments passed to the aggregation function
Returns:
Returns the collapsed element or HoloMap of collapsed
elements
"""
from .data import concat
if not dimensions:
dimensions = self.kdims
if not isinstance(dimensions, list): dimensions = [dimensions]
if self.ndims > 1 and len(dimensions) != self.ndims:
groups = self.groupby([dim for dim in self.kdims
if dim not in dimensions])

elif all(d in self.kdims for d in dimensions):
groups = UniformNdMapping([(0, self)], ['tmp'])
else:
raise KeyError("Supplied dimensions not found.")

collapsed = groups.clone(shared_data=False)
for key, group in groups.items():
if hasattr(group.values()[-1], 'interface'):
group_data = concat(group)
if function:
agg = group_data.aggregate(group.last.kdims, function, spreadfn, **kwargs)
group_data = group.type(agg)
else:
group_data = [el.data for el in group]
args = (group_data, function, group.last.kdims)
data = group.type.collapse_data(*args, **kwargs)
group_data = group.last.clone(data)
collapsed[key] = group_data
return collapsed if self.ndims-len(dimensions) else collapsed.last


def dframe(self, dimensions=None, multi_index=False):
"""Convert dimension values to DataFrame.
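The `collapse` method added above aggregates a stack of like elements with `function` and can compute a secondary `spreadfn` reduction. The numerical core of that step can be sketched with NumPy alone (hypothetical data, not the method's actual element handling):

```python
import numpy as np

# Three "curves" sampled at the same three x positions,
# stacked along axis 0 (one row per curve).
stack = np.array([[0.0, 1.0, 2.0],
                  [2.0, 3.0, 4.0],
                  [4.0, 5.0, 6.0]])

# `function` collapses the stack to one curve, e.g. the mean ...
mean_curve = stack.mean(axis=0)

# ... while `spreadfn` computes a secondary spread statistic,
# e.g. the standard deviation at each sample position.
spread = stack.std(axis=0)
```

The spread values are what make it possible to draw a confidence band around the collapsed mean curve.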
52 changes: 0 additions & 52 deletions holoviews/core/spaces.py
@@ -349,53 +349,6 @@ def collate(self, merge_type=None, drop=[], drop_constant=False):
drop_constant=drop_constant)()


def collapse(self, dimensions=None, function=None, spreadfn=None, **kwargs):
"""Concatenates and aggregates along supplied dimensions
Useful to collapse stacks of objects into a single object,
e.g. to average a stack of Images or Curves.
Args:
dimensions: Dimension(s) to collapse
Defaults to all key dimensions
function: Aggregation function to apply, e.g. numpy.mean
spreadfn: Secondary reduction to compute value spread
Useful for computing a confidence interval, spread, or
standard deviation.
**kwargs: Keyword arguments passed to the aggregation function
Returns:
Returns the collapsed element or HoloMap of collapsed
elements
"""
from .data import concat
if not dimensions:
dimensions = self.kdims
if not isinstance(dimensions, list): dimensions = [dimensions]
if self.ndims > 1 and len(dimensions) != self.ndims:
groups = self.groupby([dim for dim in self.kdims
if dim not in dimensions])
elif all(d in self.kdims for d in dimensions):
groups = HoloMap([(0, self)])
else:
raise KeyError("Supplied dimensions not found.")

collapsed = groups.clone(shared_data=False)
for key, group in groups.items():
if hasattr(group.last, 'interface'):
group_data = concat(group)
if function:
agg = group_data.aggregate(group.last.kdims, function, spreadfn, **kwargs)
group_data = group.type(agg)
else:
group_data = [el.data for el in group]
args = (group_data, function, group.last.kdims)
data = group.type.collapse_data(*args, **kwargs)
group_data = group.last.clone(data)
collapsed[key] = group_data
return collapsed if self.ndims-len(dimensions) else collapsed.last


def sample(self, samples=[], bounds=None, **sample_values):
"""Samples element values at supplied coordinates.
@@ -931,11 +884,6 @@ def __init__(self, callback, initial_items=None, streams=None, **params):
elif not isinstance(callback, Callable):
callback = Callable(callback)

if 'sampled' in params:
self.param.warning('DynamicMap sampled parameter is deprecated '
'and no longer needs to be specified.')
del params['sampled']

valid, invalid = Stream._process_streams(streams)
if invalid:
msg = ('The supplied streams list contains objects that '
