Merge branch 'master' into timeseries-wizard
srbdev committed Jan 26, 2016
2 parents 21e3dc4 + 0e21f03 commit 4e7b1ec
Showing 21 changed files with 334 additions and 258 deletions.
52 changes: 52 additions & 0 deletions README.md
@@ -4,3 +4,55 @@ This is Slycat - a web-based ensemble analysis and visualization platform, creat

For installation, tutorials, and developer documentation, go to http://slycat.readthedocs.org

# [Slycat-data](https://github.com/sandialabs/slycat-data)
A repository of sample data that can be used with Slycat.

****

# Ensemble Analysis and Visualization

## Multiple Levels of Abstraction
* Ensemble summaries (correlations or similarities)

![Ensemble summary view](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/LevelsOfAbstraction.png)

* Individual runs relative to the group (distributions or behaviors)

![Individual runs relative to the ensemble](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/LevelsOfAbstraction2.png)

* Run-specific data (numeric values, images, or videos)

![Run-specific data](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/LevelsOfAbstraction3.png)

## Sensitivity Analysis
1. Model Understanding
2. Model Validation
3. Model Simplification


![Sensitivity analysis view 1](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/LevelsOfAbstraction4.png)
![Sensitivity analysis view 2](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/LevelsOfAbstraction5.png)

## Parameter Space Exploration
1. Results Clustering
2. Design Optimization
3. Model Tuning

![Parameter space exploration view 1](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/ParameterSpaceExploration1.png)
![Parameter space exploration view 2](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/ParameterSpaceExploration2.png)
![Parameter space exploration view 3](https://github.com/sandialabs/slycat/blob/master/Sample-Images/ParameterSpaceExploration/ParameterSpaceExploration3.png)


## Anomaly Detection
1. Unique Features
2. Bugs


![Anomaly detection view 1](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection1.png)
![Anomaly detection view 2](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection2.png)
![Anomaly detection view 3](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection3.png)
![Anomaly detection view 4](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection4.png)
![Anomaly detection view 5](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection5.png)
![Anomaly detection view 6](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection6.png)
![Anomaly detection view 7](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection7.png)
![Anomaly detection view 8](https://github.com/sandialabs/slycat/blob/master/Sample-Images/Anomaly%20detection/AnomalyDetection8.png)
220 changes: 110 additions & 110 deletions packages/slycat/web/server/__init__.py
@@ -121,71 +121,71 @@ def get_model_arrayset_metadata(database, model, aid, arrays=None, statistics=No

# Handle legacy behavior.
if arrays is None and statistics is None and unique is None:
#with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file:
with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file:
hdf5_arrayset = slycat.hdf5.ArraySet(file)
results = []
for array in sorted(hdf5_arrayset.keys()):
hdf5_array = hdf5_arrayset[array]
results.append({
"array": int(array),
"index" : int(array),
"dimensions" : hdf5_array.dimensions,
"attributes" : hdf5_array.attributes,
"shape": tuple([dimension["end"] - dimension["begin"] for dimension in hdf5_array.dimensions]),
})
return results

with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file: # We have to open the file with writing enabled in case the statistics cache needs to be updated.
hdf5_arrayset = slycat.hdf5.ArraySet(file)
results = []
for array in sorted(hdf5_arrayset.keys()):
hdf5_array = hdf5_arrayset[array]
results.append({
"array": int(array),
"index" : int(array),
"dimensions" : hdf5_array.dimensions,
"attributes" : hdf5_array.attributes,
"shape": tuple([dimension["end"] - dimension["begin"] for dimension in hdf5_array.dimensions]),
})
return results
results = {}
if arrays is not None:
results["arrays"] = []
for array in slycat.hyperchunks.arrays(arrays, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
results["arrays"].append({
"index" : array.index,
"dimensions" : hdf5_array.dimensions,
"attributes" : hdf5_array.attributes,
"shape": tuple([dimension["end"] - dimension["begin"] for dimension in hdf5_array.dimensions]),
})
if statistics is not None:
results["statistics"] = []
for array in slycat.hyperchunks.arrays(statistics, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
for attribute in array.attributes(len(hdf5_array.attributes)):
statistics = {}
statistics["array"] = array.index
if isinstance(attribute.expression, slycat.hyperchunks.grammar.AttributeIndex):
statistics["attribute"] = attribute.expression.index
statistics.update(hdf5_array.get_statistics(attribute.expression.index))
else:
values = evaluate(hdf5_array, attribute.expression, "statistics")
statistics["min"] = values.min()
statistics["max"] = values.max()
statistics["unique"] = len(numpy.unique(values))
results["statistics"].append(statistics)

if unique is not None:
results["unique"] = []
for array in slycat.hyperchunks.arrays(unique, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
for attribute in array.attributes(len(hdf5_array.attributes)):
unique = {}
unique["array"] = array.index
unique["values"] = []
if isinstance(attribute.expression, slycat.hyperchunks.grammar.AttributeIndex):
for hyperslice in attribute.hyperslices():
unique["attribute"] = attribute.expression.index
unique["values"].append(hdf5_array.get_unique(attribute.expression.index, hyperslice)["values"])
else:
values = evaluate(hdf5_array, attribute.expression, "uniques")
for hyperslice in attribute.hyperslices():
unique["values"].append(numpy.unique(values)[hyperslice])
results["unique"].append(unique)

#with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file: # We have to open the file with writing enabled in case the statistics cache needs to be updated.
hdf5_arrayset = slycat.hdf5.ArraySet(file)
results = {}
if arrays is not None:
results["arrays"] = []
for array in slycat.hyperchunks.arrays(arrays, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
results["arrays"].append({
"index" : array.index,
"dimensions" : hdf5_array.dimensions,
"attributes" : hdf5_array.attributes,
"shape": tuple([dimension["end"] - dimension["begin"] for dimension in hdf5_array.dimensions]),
})
if statistics is not None:
results["statistics"] = []
for array in slycat.hyperchunks.arrays(statistics, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
for attribute in array.attributes(len(hdf5_array.attributes)):
statistics = {}
statistics["array"] = array.index
if isinstance(attribute.expression, slycat.hyperchunks.grammar.AttributeIndex):
statistics["attribute"] = attribute.expression.index
statistics.update(hdf5_array.get_statistics(attribute.expression.index))
else:
values = evaluate(hdf5_array, attribute.expression, "statistics")
statistics["min"] = values.min()
statistics["max"] = values.max()
statistics["unique"] = len(numpy.unique(values))
results["statistics"].append(statistics)

if unique is not None:
results["unique"] = []
for array in slycat.hyperchunks.arrays(unique, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
for attribute in array.attributes(len(hdf5_array.attributes)):
unique = {}
unique["array"] = array.index
unique["values"] = []
if isinstance(attribute.expression, slycat.hyperchunks.grammar.AttributeIndex):
for hyperslice in attribute.hyperslices():
unique["attribute"] = attribute.expression.index
unique["values"].append(hdf5_array.get_unique(attribute.expression.index, hyperslice)["values"])
else:
values = evaluate(hdf5_array, attribute.expression, "uniques")
for hyperslice in attribute.hyperslices():
unique["values"].append(numpy.unique(values)[hyperslice])
results["unique"].append(unique)

return results
return results
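The change in the hunk above re-enables the process-wide lock so that the HDF5 file is opened only while the lock is held. A minimal, self-contained sketch of that pattern follows; the `lock`, `open_store`, and in-memory store below are hypothetical stand-ins for `slycat.web.server.hdf5.lock` and `slycat.web.server.hdf5.open`, used here only for illustration:

```python
import threading
from contextlib import contextmanager

# Hypothetical stand-ins for slycat.web.server.hdf5.lock / .open.
lock = threading.Lock()
_store = {"arrays": {"0": {"dimensions": [{"begin": 0, "end": 10}]}}}

@contextmanager
def open_store(storage, mode="r+"):
    # A real implementation would open an HDF5 file; here we hand back a dict.
    yield _store

def get_arrayset_metadata(storage):
    # The commit's pattern: acquire the lock *before* opening the file, so
    # concurrent requests cannot interleave reads with cache updates.
    with lock:
        with open_store(storage, "r+") as file:
            results = []
            for name in sorted(file["arrays"]):
                dims = file["arrays"][name]["dimensions"]
                results.append({
                    "array": int(name),
                    "shape": tuple(d["end"] - d["begin"] for d in dims),
                })
            return results

print(get_arrayset_metadata("abc123"))  # [{'array': 0, 'shape': (10,)}]
```

The key point is the nesting order: the lock wraps the `open`, so every open-read-close cycle is serialized across threads.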

def get_model_arrayset_data(database, model, aid, hyperchunks):
"""Read data from an arrayset artifact.
@@ -210,22 +210,22 @@ def get_model_arrayset_data(database, model, aid, hyperchunks):
if isinstance(hyperchunks, basestring):
hyperchunks = slycat.hyperchunks.parse(hyperchunks)

# with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file:
hdf5_arrayset = slycat.hdf5.ArraySet(file)
for array in slycat.hyperchunks.arrays(hyperchunks, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file:
hdf5_arrayset = slycat.hdf5.ArraySet(file)
for array in slycat.hyperchunks.arrays(hyperchunks, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]

if array.order is not None:
order = evaluate(hdf5_array, array.order, "order")
if array.order is not None:
order = evaluate(hdf5_array, array.order, "order")

for attribute in array.attributes(len(hdf5_array.attributes)):
values = evaluate(hdf5_array, attribute.expression, "attribute")
for hyperslice in attribute.hyperslices():
if array.order is not None:
yield values[order][hyperslice]
else:
yield values[hyperslice]
for attribute in array.attributes(len(hdf5_array.attributes)):
values = evaluate(hdf5_array, attribute.expression, "attribute")
for hyperslice in attribute.hyperslices():
if array.order is not None:
yield values[order][hyperslice]
else:
yield values[hyperslice]
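`get_model_arrayset_data` is a generator that applies an optional sort order and then yields one hyperslice at a time. A simplified sketch of that shape, with a hypothetical signature (the real function takes `database`, `model`, `aid`, and `hyperchunks`):

```python
import threading
import numpy

lock = threading.Lock()

def get_arrayset_data(values, order=None, hyperslices=(slice(None),)):
    # Apply an optional sort order, then yield one hyperslice at a time.
    # Note: because the `with lock:` block encloses the yields, the lock
    # stays held until the consumer exhausts or closes the generator.
    with lock:
        for hyperslice in hyperslices:
            if order is not None:
                yield values[order][hyperslice]
            else:
                yield values[hyperslice]

values = numpy.array([3.0, 1.0, 2.0])
order = numpy.argsort(values)        # sorted order: [1, 2, 0]
print(list(get_arrayset_data(values, order, [slice(0, 2)])))  # [array([1., 2.])]
```

Holding a lock across yields is a real consequence of moving the generator body inside the `with lock:` block, as this commit does; it trades concurrency for consistency of the underlying file.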

def get_model_parameter(database, model, aid):
key = "artifact:%s" % aid
@@ -238,22 +238,22 @@ def put_model_arrayset(database, model, aid, input=False):
"""Start a new model array set artifact."""
slycat.web.server.update_model(database, model, message="Starting array set %s." % (aid))
storage = uuid.uuid4().hex
#with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.create(storage) as file:
arrayset = slycat.hdf5.start_arrayset(file)
database.save({"_id" : storage, "type" : "hdf5"})
model["artifact:%s" % aid] = storage
model["artifact-types"][aid] = "hdf5"
if input:
model["input-artifacts"] = list(set(model["input-artifacts"] + [aid]))
database.save(model)
with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.create(storage) as file:
arrayset = slycat.hdf5.start_arrayset(file)
database.save({"_id" : storage, "type" : "hdf5"})
model["artifact:%s" % aid] = storage
model["artifact-types"][aid] = "hdf5"
if input:
model["input-artifacts"] = list(set(model["input-artifacts"] + [aid]))
database.save(model)

def put_model_array(database, model, aid, array_index, attributes, dimensions):
slycat.web.server.update_model(database, model, message="Starting array set %s array %s." % (aid, array_index))
storage = model["artifact:%s" % aid]
#with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(storage, "r+") as file:
slycat.hdf5.ArraySet(file).start_array(array_index, dimensions, attributes)
with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(storage, "r+") as file:
slycat.hdf5.ArraySet(file).start_array(array_index, dimensions, attributes)

def put_model_arrayset_data(database, model, aid, hyperchunks, data):
"""Write data to an arrayset artifact.
@@ -281,23 +281,23 @@ def put_model_arrayset_data(database, model, aid, hyperchunks, data):

slycat.web.server.update_model(database, model, message="Storing data to array set %s." % (aid))

#with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file:
hdf5_arrayset = slycat.hdf5.ArraySet(file)
for array in slycat.hyperchunks.arrays(hyperchunks, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
for attribute in array.attributes(len(hdf5_array.attributes)):
if not isinstance(attribute.expression, slycat.hyperchunks.grammar.AttributeIndex):
slycat.email.send_error("slycat.web.server.__init__.py put_model_arrayset_data", "Cannot write to computed attribute.")
raise ValueError("Cannot write to computed attribute.")
stored_type = slycat.hdf5.dtype(hdf5_array.attributes[attribute.expression.index]["type"])
for hyperslice in attribute.hyperslices():
cherrypy.log.error("Writing to %s/%s/%s/%s" % (aid, array.index, attribute.expression.index, hyperslice))

data_hyperslice = next(data)
if isinstance(data_hyperslice, list):
data_hyperslice = numpy.array(data_hyperslice, dtype=stored_type)
hdf5_array.set_data(attribute.expression.index, hyperslice, data_hyperslice)
with slycat.web.server.hdf5.lock:
with slycat.web.server.hdf5.open(model["artifact:%s" % aid], "r+") as file:
hdf5_arrayset = slycat.hdf5.ArraySet(file)
for array in slycat.hyperchunks.arrays(hyperchunks, hdf5_arrayset.array_count()):
hdf5_array = hdf5_arrayset[array.index]
for attribute in array.attributes(len(hdf5_array.attributes)):
if not isinstance(attribute.expression, slycat.hyperchunks.grammar.AttributeIndex):
slycat.email.send_error("slycat.web.server.__init__.py put_model_arrayset_data", "Cannot write to computed attribute.")
raise ValueError("Cannot write to computed attribute.")
stored_type = slycat.hdf5.dtype(hdf5_array.attributes[attribute.expression.index]["type"])
for hyperslice in attribute.hyperslices():
cherrypy.log.error("Writing to %s/%s/%s/%s" % (aid, array.index, attribute.expression.index, hyperslice))

data_hyperslice = next(data)
if isinstance(data_hyperslice, list):
data_hyperslice = numpy.array(data_hyperslice, dtype=stored_type)
hdf5_array.set_data(attribute.expression.index, hyperslice, data_hyperslice)
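The write loop above pulls one hyperslice at a time from the `data` iterator and coerces plain lists to the attribute's stored dtype before writing. A stripped-down sketch of that inner loop, writing into a NumPy array instead of an HDF5 attribute (the `target` and `write_hyperslices` names are hypothetical):

```python
import numpy

def write_hyperslices(target, stored_type, hyperslices, data):
    # Mirrors the write loop above: each incoming hyperslice may arrive as a
    # plain list, so coerce it to the stored dtype before writing it in place.
    for hyperslice in hyperslices:
        data_hyperslice = next(data)
        if isinstance(data_hyperslice, list):
            data_hyperslice = numpy.array(data_hyperslice, dtype=stored_type)
        target[hyperslice] = data_hyperslice

target = numpy.zeros(4, dtype="float64")
write_hyperslices(target, "float64", [slice(0, 2), slice(2, 4)], iter([[1, 2], [3, 4]]))
print(target)  # [1. 2. 3. 4.]
```

Coercing on write keeps the stored data's type stable regardless of how the client serialized each chunk.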

def put_model_file(database, model, aid, value, content_type, input=False):
fid = database.write_file(model, content=value, content_type=content_type)
@@ -331,10 +331,10 @@ def put_model_inputs(database, model, source, deep_copy=False):
if deep_copy:
new_value = uuid.uuid4().hex
os.makedirs(os.path.dirname(slycat.web.server.hdf5.path(new_value)))
#with slycat.web.server.hdf5.lock:
shutil.copy(slycat.web.server.hdf5.path(original_value), slycat.web.server.hdf5.path(new_value))
model["artifact:%s" % aid] = new_value
database.save({"_id" : new_value, "type" : "hdf5"})
with slycat.web.server.hdf5.lock:
shutil.copy(slycat.web.server.hdf5.path(original_value), slycat.web.server.hdf5.path(new_value))
model["artifact:%s" % aid] = new_value
database.save({"_id" : new_value, "type" : "hdf5"})
else:
model["artifact:%s" % aid] = original_value
elif original_type == "file":
