Merged
64 commits
9f82c4a
fix ansible README for settings; comment out spurious log statements …
May 7, 2020
742de3c
fixed .gitignore
May 7, 2020
6e209c5
Changed GUI background pic
May 7, 2020
9f05db1
fix: clarify CONTRIBUTING.md It was not obvious where pull requests s…
May 7, 2020
2375523
docs: minor edits to .md files for the relative path
kingzevin May 7, 2020
8449176
initial ASL Map state feature attempt
ksatzke May 8, 2020
7a701f8
adding more tests
ksatzke May 8, 2020
dc08d6f
fix: ansible Makefile dependencies for management service, test utils…
May 12, 2020
cf59f36
Ansible installation - updating README.md and also removing unused se…
paarijaat May 12, 2020
0f5cda0
Ansible readme, minor edit to change mfn to knix
paarijaat May 12, 2020
7f89fdf
asl_ tests working except for Parameters and Map
ksatzke May 12, 2020
ce7892c
SDK and CLI fixes
manuelstein May 12, 2020
f3803e0
Improving deployment documentation
manuelstein May 12, 2020
75b11a5
Helm deployment instructions - adding push target and adapting README.md
manuelstein May 12, 2020
05e8fec
fix: sandbox agent sanity checks for queue service and frontend durin…
May 12, 2020
c2a25d3
changed DataLayerService's server type from TThreadPoolServer to TThr…
ruichuan May 13, 2020
d4788f8
Update README.md for Kubernetes installation
May 13, 2020
a421d51
Moving K8s-specific web UI information
manuelstein May 13, 2020
e0d21f5
added legally-requested ASL spec link
ruichuan May 12, 2020
36daaec
tests using 'Parameters' running successfully
ksatzke May 13, 2020
6988433
Map state branches executing, triggering of end state not working
ksatzke May 14, 2020
56c9398
branch terminal states publish correct results
ksatzke May 14, 2020
0d01101
fix: remove spurious comment from getWorkflowStatus()
May 14, 2020
45421e3
counter trigger working again, problems with I/O paths filtering
ksatzke May 15, 2020
491379f
fix: update sandbox status when shutting down due to a failure
May 15, 2020
2c5bfb7
Map state tests partially executing successfully
ksatzke May 15, 2020
250d454
fix: Queue service now has a fixed capacity for (each) queue (1000), …
May 15, 2020
3485b54
deployWorkflow [K8s] changes: undeploy if exists
manuelstein May 15, 2020
9c72483
changed QueueService's server type from TThreadPoolServer to TThreade…
ruichuan May 13, 2020
8168949
go back to ThreadPoolServer in QueueService and DataLayerService
May 15, 2020
5e4d4b3
memory efficiency: DataLayerService's server type changed from TThrea…
ruichuan May 16, 2020
9952a4f
modified the thread pool size in DataLayerService for higher concurrency
ruichuan May 16, 2020
5e5e5fb
refactor: management service returns more consistent workflow status
May 17, 2020
6c8eb39
update and cleanup ansible installation scripts and readme
May 17, 2020
c845e46
Merge branch 'master' into develop
May 18, 2020
44393a4
all but one test execute successfully
ksatzke May 18, 2020
541f1e1
all ASL Map state tests passing
ksatzke May 18, 2020
cf8f67b
fix GUI update/creation of SDK markdown files
manuelstein May 18, 2020
7392f9c
adding clean-up of object store data
ksatzke May 19, 2020
2573167
initial ASL Map state feature attempt
ksatzke May 8, 2020
dce7f25
adding more tests
ksatzke May 8, 2020
0a7b8bf
asl_ tests working except for Parameters and Map
ksatzke May 12, 2020
55f5b28
tests using 'Parameters' running successfully
ksatzke May 13, 2020
0aa2e47
Map state branches executing, triggering of end state not working
ksatzke May 14, 2020
8c43358
branch terminal states publish correct results
ksatzke May 14, 2020
5699e31
counter trigger working again, problems with I/O paths filtering
ksatzke May 15, 2020
38473e4
Map state tests partially executing successfully
ksatzke May 15, 2020
f06a9e7
all but one test execute successfully
ksatzke May 18, 2020
d83e0fb
all ASL Map state tests passing
ksatzke May 18, 2020
c57ffe6
adding clean-up of object store data
ksatzke May 19, 2020
a48b3af
Merge branch 'feature/ASL_Map_state' of https://github.com/knix-micro…
ksatzke May 19, 2020
45c7df1
convert logging statements from info() to debug()
May 19, 2020
35f1ac1
more logging statements conversion
May 19, 2020
bd9c9c8
fix: management service deleteWorkflow allows deletion of workflows w…
May 19, 2020
f6c33ba
fix: test import_error now deletes the failed workflow
May 19, 2020
60a9e4e
merge 'develop' into 'feature/ASL_Map_state'
May 19, 2020
b6a50f5
added missing ASL validation support for Map state in GUI
abeckn May 20, 2020
1b50061
fix: management Makefile uses current user id and group for thrift build
May 21, 2020
4b7c9bd
refactoring ASL Map state tests
ksatzke May 22, 2020
1048aac
refactoring ASL Map state tests
ksatzke May 22, 2020
a95c0a4
make test folder names consistent
ksatzke May 25, 2020
6648c80
after git rebase develop
ksatzke May 25, 2020
4a4a8e3
Merge branch 'feature/ASL_Map_state' into develop
May 26, 2020
fc00625
fix: return empty workflow list for new users, fixes #16
May 26, 2020
17 changes: 15 additions & 2 deletions FunctionWorker/python/FunctionWorker.py
Original file line number Diff line number Diff line change
@@ -330,10 +330,22 @@ def _fork_and_handle_message(self, key, encapsulated_value):

# 3. Apply InputPath, if available
timestamp_map["t_start_inputpath"] = time.time() * 1000.0
#self._logger.debug("[FunctionWorker] Before Path/Parameters processing, input: " + str(type(raw_state_input)) + " : " + str(raw_state_input) + ", metadata: " + str(metadata) + " has_error: " + str(has_error))
if not has_error:
try:
function_input = self._state_utils.applyInputPath(raw_state_input)
#self._logger.debug("[FunctionWorker] User code input(After InputPath processing):" + str(type(function_input)) + ":" + str(function_input))
if "__state_action" not in metadata or (metadata["__state_action"] != "post_map_processing" and metadata["__state_action"] != "post_parallel_processing"):
#self._logger.debug("[FunctionWorker] User code input(Before InputPath processing):" + str(type(raw_state_input)) + ":" + str(raw_state_input))
function_input = self._state_utils.applyInputPath(raw_state_input)
#self._logger.debug("[FunctionWorker] User code input(Before applyParameter processing):" + str(type(function_input)) + ":" + str(function_input))
function_input = self._state_utils.applyParameters(function_input)
#self._logger.debug("[FunctionWorker] User code input(Before ItemsPath processing):" + str(type(function_input)) + ":" + str(function_input))
function_input = self._state_utils.applyItemsPath(function_input) # process map items path

#elif "Action" not in metadata or metadata["Action"] != "post_parallel_processing":
# function_input = self._state_utils.applyInputPath(raw_state_input)

else:
function_input = raw_state_input
except Exception as exc:
self._logger.exception("InputPath processing exception: %s\n%s", str(instance_pid), str(exc))
error_type = "InputPath processing exception"
@@ -592,6 +604,7 @@ def _get_and_handle_message(self):

def run(self):
self._is_running = True

self._logger.info("[FunctionWorker] Started:" \
+ self._function_state_name \
+ ", user: " + self._userid \
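The hunk above establishes the order in which the FunctionWorker filters state input: InputPath first, then Parameters, then ItemsPath (skipping all three for post-Map/post-Parallel processing actions). A standalone sketch of that pipeline — not KNIX's actual `StateUtils`, and using simplified `$.key` selectors rather than full JSONPath — is:

```python
# Sketch of the ASL input pipeline the diff wires up:
# InputPath -> Parameters -> ItemsPath (simplified; illustrative only).

def select(path, data):
    """Resolve a simplified '$.a.b' selector against nested dicts."""
    if path in (None, "$"):
        return data
    for part in path.lstrip("$.").split("."):
        data = data[part]
    return data

def process_input(raw_input, state):
    value = select(state.get("InputPath", "$"), raw_input)
    if "Parameters" in state:
        # Real ASL also substitutes '.$'-suffixed keys; literal copy here.
        value = dict(state["Parameters"])
    return select(state.get("ItemsPath", "$"), value)

state = {"InputPath": "$.detail", "ItemsPath": "$.items"}
raw = {"detail": {"items": [1, 2, 3]}}
print(process_input(raw, state))  # [1, 2, 3]
```

A Map state would then fan out one branch per element of the resulting list.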
7 changes: 7 additions & 0 deletions FunctionWorker/python/PublicationUtils.py
Original file line number Diff line number Diff line change
@@ -356,6 +356,13 @@ def _generate_trigger_metadata(self, topic_next):

output_instance_id = self._output_counter_map[topic_next]
next_function_execution_id = self._metadata["__function_execution_id"] + "_" + str(output_instance_id)

# get current state type. if map state add marker to execution Id
state_type = self._state_utils.functionstatetype
self._logger.debug("self._state_utils.functionstatetype: " + str(state_type))

if state_type == 'Map':
next_function_execution_id = self._metadata["__function_execution_id"] + "_" + str(output_instance_id)+"-M"
self._output_counter_map[topic_next] += 1

trigger_metadata = copy.deepcopy(self._metadata)
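The PublicationUtils change appends a `-M` marker to the next execution id when the producing state is a Map state, so downstream states can recognize map-branch invocations. A minimal sketch of that id construction (function and argument names are illustrative, not KNIX's API):

```python
# Sketch of tagging the next execution id with a '-M' marker for Map states.

def next_execution_id(execution_id, output_instance_id, state_type):
    next_id = "%s_%d" % (execution_id, output_instance_id)
    if state_type == "Map":
        next_id += "-M"  # marker lets consumers detect map-branch results
    return next_id

print(next_execution_id("exec1", 0, "Map"))   # exec1_0-M
print(next_execution_id("exec1", 0, "Task"))  # exec1_0
```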
700 changes: 670 additions & 30 deletions FunctionWorker/python/StateUtils.py

Large diffs are not rendered by default.

2 changes: 2 additions & 0 deletions GUI/.gitignore
@@ -0,0 +1,2 @@
app/pages/docs/sdk/sdk.md
app/pages/docs/sdk/cli.md
4 changes: 3 additions & 1 deletion GUI/Makefile
@@ -23,7 +23,9 @@ app/pages/docs/sdk/cli.md: ../mfn_cli/README.md
-include ../proxy.mk
include ../docker.mk

image: $(shell find ./ -type f|grep -v Makefile)
image: $(shell find ./ -type f|grep -v -e Makefile -e app/pages/docs/sdk/sdk.md -e app/pages/docs/sdk/cli.md) \
app/pages/docs/sdk/cli.md \
app/pages/docs/sdk/sdk.md
$(call build_image,Dockerfile,microfn/nginx)

push: image
4 changes: 4 additions & 0 deletions GUI/app/pages/workflows/WorkflowEditorCtrl.js
@@ -31,6 +31,9 @@
var fail = $.getJSON("lib/asl-validator/schemas/fail.json", function(json) {
});

var mapState = $.getJSON("lib/asl-validator/schemas/map.json", function(json) {
});

var parallel = $.getJSON("lib/asl-validator/schemas/parallel.json", function(json) {
});

@@ -1209,6 +1212,7 @@
choice.responseJSON,
fail.responseJSON,
parallel.responseJSON,
mapState.responseJSON,
pass.responseJSON,
stateMachine.responseJSON,
state.responseJSON,
95 changes: 95 additions & 0 deletions GUI/lib/asl-validator/schemas/map.json
@@ -0,0 +1,95 @@
{
"$id": "http://asl-validator.cloud/map#",
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"Type": {
"type": "string",
"pattern": "^Map$"
},
"Next": {
"type": "string"
},
"End": {
"enum": [true]
},
"Comment": {
"type": "string"
},
"OutputPath": {
"type": ["string", "null"]
},
"InputPath": {
"type": ["string", "null"]
},
"ResultPath": {
"type": ["string", "null"]
},
"ItemsPath": {
"type": ["string", "null"]
},
"MaxConcurrency": {
"type": "number",
"minimum": 0
},
"Iterator": {
"$ref": "http://asl-validator.cloud/state-machine#"
},
"Parameters": {
"type": "object"
},
"Retry": {
"type": "array",
"items": {
"type": "object",
"properties": {
"ErrorEquals": {
"type": "array",
"items": {
"type": "string"
}
},
"IntervalSeconds": {
"type": "number",
"minimum": 0
},
"MaxAttempts": {
"type": "number",
"minimum": 0
},
"BackoffRate": {
"type": "number",
"minimum": 0
}
},
"required": ["ErrorEquals"]
}
},
"Catch": {
"type": "array",
"items": {
"type": "object",
"properties": {
"ErrorEquals": {
"type": "array",
"items": {
"type": "string"
}
},
"Next": {
"type": "string"
}
},
"required": ["ErrorEquals", "Next"]
}
}
},
"oneOf": [{
"required": ["Next"]
}, {
"required": ["End"]
}],
"required": ["Type", "Iterator"],
"additionalProperties": false
}

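The new schema requires `Type` and `Iterator` and exactly one of `Next`/`End`. A minimal Map state that satisfies it, with a hand-rolled check of those constraints (a sketch — the GUI uses the full JSON-Schema validator, not this function):

```python
# A minimal Map state accepted by map.json, with a simplified check of the
# schema's "required" and "oneOf" (Next xor End) constraints.

map_state = {
    "Type": "Map",
    "ItemsPath": "$.items",
    "MaxConcurrency": 2,
    "Iterator": {
        "StartAt": "DoOne",
        "States": {"DoOne": {"Type": "Pass", "End": True}},
    },
    "End": True,
}

def looks_valid(state):
    has_required = state.get("Type") == "Map" and "Iterator" in state
    has_transition = ("Next" in state) != ("End" in state)  # oneOf
    return has_required and has_transition

print(looks_valid(map_state))  # True
```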
3 changes: 3 additions & 0 deletions GUI/lib/asl-validator/schemas/pass.json
@@ -25,6 +25,9 @@
"ResultPath": {
"type": "string"
},
"Parameters": {
"type": "object"
},
"Result": {}
},
"oneOf": [{
3 changes: 3 additions & 0 deletions GUI/lib/asl-validator/schemas/state.json
@@ -11,6 +11,9 @@
{
"$ref": "http://asl-validator.cloud/parallel#"
},
{
"$ref": "http://asl-validator.cloud/map#"
},
{
"$ref": "http://asl-validator.cloud/pass#"
},
2 changes: 1 addition & 1 deletion ManagementService/Makefile
@@ -28,7 +28,7 @@ thrift: $(THRIFT)

$(THRIFT): ../DataLayerService/thrift/DataLayerMessage.thrift ../DataLayerService/thrift/DataLayerService.thrift
mkdir -p data_layer
docker run --rm -v $(CURDIR)/..:/root -w /root thrift:0.12.0 bash -c '\
docker run --user $$(id -u):$$(id -g) --rm -v $(CURDIR)/..:/root -w /root thrift:0.12.0 bash -c '\
thrift --gen py -out ManagementService/ DataLayerService/thrift/DataLayerMessage.thrift; \
thrift --gen py -out ManagementService/ DataLayerService/thrift/DataLayerService.thrift'

14 changes: 13 additions & 1 deletion ManagementService/management_init.py
@@ -74,6 +74,7 @@ def parse_states(state_map):

if stype == "Task":
functions.append(wfs["Resource"])

elif stype == 'Parallel':
# add the parallel state function worker
states.append({'name':sresource,'type':stype})
@@ -83,7 +84,18 @@
# add found functions to list of function workers
functions.extend(sub_functions)
states.extend(sub_states)
elif stype in {'Choice', 'Pass', 'Wait', 'Fail', 'Succeed'}: #, 'Parallel'}:

elif stype == 'Map':
# add the Map state iterator function worker
states.append({'name':sresource,'type':stype})
# find recursively everything that is in the branch
branch = state_map[sresource]['Iterator']
sub_functions, sub_states = parse_states(branch['States'])
# add found functions to list of function workers
functions.extend(sub_functions)
states.extend(sub_states)

elif stype in {'Choice', 'Pass', 'Wait', 'Fail', 'Succeed'}:
states.append({'name':sresource,'type':stype})
else:
raise Exception("Unknown state type: " + stype)
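The `Map` branch added to `parse_states` mirrors the existing `Parallel` handling: the Map state itself becomes a function worker, and everything inside its `Iterator` is collected recursively. A self-contained sketch of that recursion (simplified — it omits the Parallel branch and other state types the real function handles):

```python
# Sketch of the parse_states recursion for Map states: collect the Map
# state itself, then descend into Iterator['States'].

def parse_states(state_map):
    functions, states = [], []
    for name, st in state_map.items():
        stype = st["Type"]
        if stype == "Task":
            functions.append(st["Resource"])
            states.append({"name": name, "type": stype})
        elif stype == "Map":
            states.append({"name": name, "type": stype})
            sub_functions, sub_states = parse_states(st["Iterator"]["States"])
            functions.extend(sub_functions)
            states.extend(sub_states)
        else:  # Choice, Pass, Wait, Fail, Succeed
            states.append({"name": name, "type": stype})
    return functions, states

wf = {"M": {"Type": "Map",
            "Iterator": {"States": {"T": {"Type": "Task", "Resource": "f1"}}}}}
print(parse_states(wf))  # (['f1'], [{'name': 'M', 'type': 'Map'}, {'name': 'T', 'type': 'Task'}])
```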
2 changes: 1 addition & 1 deletion ManagementService/python/deleteWorkflow.py
@@ -39,7 +39,7 @@ def handle(value, sapi):
wf = sapi.get(email + "_workflow_" + workflow["id"], True)
if wf is not None and wf != "":
wf = json.loads(wf)
if wf["status"] == "undeployed":
if wf["status"] == "undeployed" or wf["status"] == "failed":
for wn in workflows:
if workflows[wn] == workflow["id"]:
del workflows[wn]
16 changes: 16 additions & 0 deletions ManagementService/python/deployWorkflow.py
@@ -48,6 +48,22 @@ def check_workflow_functions(wf_type, wfobj, email, sapi):
for state in list(branches['States'].keys()): # state is the key
wf_state_map[state] = branches['States'][state] # add the state to the state map root

if wf_state_map[state_names]["Type"] == "Map":
mapStateName = state_names
iterator = wf_state_map[mapStateName]['Iterator'] # this is a dict

states_dict = iterator['States'] # this is a also dict
print (json.dumps(states_dict))
for state in states_dict.keys():
print ("FOUND MAP STATE: "+str(state))
wf_state_map[state] = states_dict[state]

"""
for iterators in wf_state_map[mapStateName]['Iterator']:
for state in list(iterators['States'].keys()): # state is the key
wf_state_map[state] = iterators['States'][state] # add the state to the state map root
"""

for wfsname in wf_state_map:
wfs = wf_state_map[wfsname]
if wfs["Type"] == "Task":
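The deployWorkflow change hoists a Map state's `Iterator` states into the root state map so each inner state gets its own worker. The core of that flattening step can be sketched as (a non-destructive simplification; the diff mutates `wf_state_map` in place):

```python
# Sketch of hoisting Map Iterator states into the root state map.

def flatten_map_states(wf_state_map):
    flat = dict(wf_state_map)
    for name, st in wf_state_map.items():
        if st.get("Type") == "Map":
            # each iterator state becomes a root-level entry
            flat.update(st["Iterator"]["States"])
    return flat

wf = {"M": {"Type": "Map",
            "Iterator": {"States": {"DoOne": {"Type": "Task", "Resource": "f1"}}}}}
print(sorted(flatten_map_states(wf)))  # ['DoOne', 'M']
```

One caveat visible in the diff's commented-out block: iterator state names must not collide with root state names, or the later entry overwrites the earlier one.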
6 changes: 3 additions & 3 deletions ManagementService/python/getWorkflows.py
@@ -138,9 +138,9 @@ def handle(value, sapi):
try:
workflows = sapi.get(email + "_list_workflows", True)
if workflows is None or workflows == "":
raise Exception("Couldn't retrieve workflow status; no such workflow.")

workflows = json.loads(workflows)
workflows = {}
else:
workflows = json.loads(workflows)

# get single workflow status
if "workflow" in data and "id" in data["workflow"]:
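This last hunk changes getWorkflows so that a user with no stored workflow list gets an empty result instead of an exception (the "fixes #16" commit above). The behavioural change in isolation — with the storage lookup replaced by a plain argument for illustration:

```python
# Sketch of the getWorkflows fix: absent/empty stored list -> empty dict,
# instead of raising "no such workflow" for brand-new users.
import json

def load_workflow_list(stored):
    if stored is None or stored == "":
        return {}  # new user: no workflows yet (fixes #16)
    return json.loads(stored)

print(load_workflow_list(None))                  # {}
print(load_workflow_list('{"wf1": "id-123"}'))   # {'wf1': 'id-123'}
```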