[BUG] Using NoneType block_run_dicts and trying to iterate on None in BlockExecutor when executing data integration block on ECS executor
I have Mage AI deployed to ECS and created a pipeline that fetches data from a source API, transforms it, and then uses the "data integration in batch pipelines" feature to move the data to Snowflake. This works as expected as long as I don't use the ECS executor, but once I enable it, the pipeline executes successfully until it reaches the data integration block, which then throws the following error:
[2024-07-02 13:57:00 UTC+2:00] ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/mage_ai/cli/main.py", line 241, in run
ExecutorFactory.get_pipeline_executor(
File "/usr/local/lib/python3.10/site-packages/mage_ai/data_preparation/executors/pipeline_executor.py", line 70, in execute
asyncio.run(self.__run_blocks(
File "/usr/local/lib/python3.10/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
return future.result()
File "/usr/local/lib/python3.10/site-packages/mage_ai/data_preparation/executors/pipeline_executor.py", line 151, in __run_blocks
block_run_outputs = await asyncio.gather(*block_run_tasks)
File "/usr/local/lib/python3.10/site-packages/mage_ai/data_preparation/executors/pipeline_executor.py", line 134, in execute_block
raise error
File "/usr/local/lib/python3.10/site-packages/mage_ai/data_preparation/executors/pipeline_executor.py", line 113, in execute_block
return BlockExecutor(block_run_id=block_run.id, **
File "/usr/local/lib/python3.10/site-packages/mage_ai/data_preparation/executors/block_executor.py", line 499, in execute
for block_run_dict in block_run_dicts:
TypeError: 'NoneType' object is not iterable
[2024-07-02 13:57:00 UTC+2:00] ERROR: Pipeline execution failed due to TypeError: 'NoneType' object is not iterable
Looking into the code, I can see that the pipeline executor never passes block_run_dicts to the block executor. This appears to cause the issue, since the parameter's default value is None rather than an empty list, so iterating over it fails.
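The failure mode can be sketched in isolation. The function below is a hypothetical, stripped-down stand-in for BlockExecutor.execute (the name and signature are simplified for illustration and are not Mage's actual code): a parameter defaulting to None is iterated without a guard, which raises exactly the TypeError seen in the traceback when the caller omits the argument.

```python
# Hypothetical simplification of the reported bug, not Mage's actual code:
# a keyword argument defaults to None and is iterated unconditionally.
def execute(block_run_dicts=None):
    results = []
    for block_run_dict in block_run_dicts:  # TypeError when block_run_dicts is None
        results.append(block_run_dict)
    return results

try:
    execute()  # caller never passes block_run_dicts, as in the report
except TypeError as exc:
    print(exc)  # 'NoneType' object is not iterable
```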
To reproduce
No response
Expected behavior
If my observations about the problem are correct, either pass block_run_dicts to the block executor's execute method, or default it to an empty list so that iterating over None is avoided.
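The empty-list option can be sketched as follows, again using a simplified, hypothetical stand-in for the executor's execute method (the name and parameter are illustrative, not Mage's actual signature): normalizing the argument to an empty list before iterating means a missing block_run_dicts no longer raises.

```python
# Hypothetical sketch of the proposed fix, not Mage's actual code:
# treat a missing/None block_run_dicts as "no block runs" instead of failing.
def execute(block_run_dicts=None):
    block_run_dicts = block_run_dicts or []  # normalize None to an empty list
    results = []
    for block_run_dict in block_run_dicts:
        results.append(block_run_dict)
    return results

print(execute())              # []
print(execute([{"id": 1}]))   # [{'id': 1}]
```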
Screenshots
No response
Operating system
No response
Additional context
No response
Mage version
0.9.72