```
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/root/spark/python/pyspark/worker.py", line 79, in main
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "/root/spark/python/pyspark/serializers.py", line 196, in dump_stream
    self.serializer.dump_stream(self._batched(iterator), stream)
  File "/root/spark/python/pyspark/serializers.py", line 127, in dump_stream
    for obj in iterator:
  File "/root/spark/python/pyspark/serializers.py", line 185, in _batched
    for item in iterator:
  File "/root/spark/python/pyspark/rdd.py", line 1148, in takeUpToNumLeft
    yield next(iterator)
  File "thunder/rdds/fileio/seriesloader.py", line 362, in readblockfromtif
  File "build/bdist.linux-x86_64/egg/thunder/rdds/fileio/readers.py", line 448, in close
    self._key.close(fast=True)
TypeError: close() got an unexpected keyword argument 'fast'
```
This is a boto versioning problem: the ec2 cluster is apparently provisioned with boto 2.8.0, where the `fast` keyword to `close()` doesn't exist yet. I'll remove that keyword, along with anything else that crept in requiring a newer boto.
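One way to stay compatible with both boto versions, rather than dropping the keyword entirely, is to attempt the fast close and fall back when the older API rejects it. This is only a sketch (the `close_key_compat` helper and the `LegacyKey` stub are hypothetical, used here to stand in for a boto 2.8.0 `Key`), not the actual change to `thunder/rdds/fileio/readers.py`:

```python
class LegacyKey:
    """Stand-in for a boto 2.8.0 S3 Key: close() takes no keyword arguments."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


def close_key_compat(key):
    # Newer boto accepts close(fast=True) to abort the connection without
    # draining the response body; older releases raise TypeError on the
    # unexpected keyword, so fall back to a plain close() there.
    try:
        key.close(fast=True)
    except TypeError:
        key.close()
```

Simply deleting the `fast=True` argument, as proposed above, also works; the fallback just preserves the faster close where the provisioned boto supports it.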
Running on an ec2 cluster launched via the thunder-ec2 script...
some output happens, then the traceback above appears.