
06.AutoML_Databricks_local got error ssl3_get_server_certificate #139

@Exlsunshine

Description


Got the following error when running notebook 06.AutoML_Databricks_local.
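For context, the config being submitted is the notebook's AutoMLConfig. A rough sketch of it (parameter values here are illustrative, not the exact ones from my run; spark_context=sc is the piece that fans the child iterations out to the ADB executors):

from azureml.train.automl import AutoMLConfig

automl_config = AutoMLConfig(task='classification',
                             debug_log='automl_errors.log',
                             primary_metric='AUC_weighted',
                             iterations=10,                 # number of pipelines to evaluate
                             iteration_timeout_minutes=10,  # per-iteration time cap
                             max_concurrent_iterations=2,   # keep <= number of worker nodes
                             X=X_train,                     # training features, prepared earlier in the notebook
                             y=y_train,                     # training labels
                             n_cross_validations=3,
                             spark_context=sc)              # run the iterations on the Databricks cluster

The failing cell: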

local_run = experiment.submit(automl_config, show_output=True)  # for longer runs, use show_output=False; see below
Running on ADB cluster experiment automl-local-classification.
Parent Run ID: AutoML_cf8bb694-d6b1-4615-ac67-3d8ead5d51e7
*******************************************************************************************
ITERATION: The iteration being evaluated.
PIPELINE: A summary description of the pipeline being evaluated.
DURATION: Time taken for the current iteration.
METRIC: The result of computing score on the fitted pipeline.
BEST: The best observed score thus far.
*******************************************************************************************

 ITERATION   PIPELINE                                       DURATION      METRIC      BEST
Exception in thread ADB Experiment: automl-local-classification:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/databricks/python/lib/python3.5/site-packages/azureml/train/automl/_adb_driver_node.py", line 36, in run
    automlRDD.map(adb_run_experiment).collect()
  File "/databricks/spark/python/pyspark/rdd.py", line 837, in collect
    sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/databricks/spark/python/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3.0 (TID 78, 10.139.64.8, executor 0): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/contrib/pyopenssl.py", line 444, in wrap_socket
    cnx.do_handshake()
  File "/databricks/python3/lib/python3.5/site-packages/OpenSSL/SSL.py", line 1907, in do_handshake
    self._raise_ssl_error(self._ssl, result)
  File "/databricks/python3/lib/python3.5/site-packages/OpenSSL/SSL.py", line 1639, in _raise_ssl_error
    _raise_current_error()
  File "/databricks/python3/lib/python3.5/site-packages/OpenSSL/_util.py", line 54, in exception_from_error_queue
    raise exception_type(errors)
OpenSSL.SSL.Error: [('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/connectionpool.py", line 343, in _make_request
    self._validate_conn(conn)
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/connectionpool.py", line 849, in _validate_conn
    conn.connect()
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/connection.py", line 356, in connect
    ssl_context=context)
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/util/ssl_.py", line 359, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/contrib/pyopenssl.py", line 450, in wrap_socket
    raise ssl.SSLError('bad handshake: %r' % e)
ssl.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/databricks/python3/lib/python3.5/site-packages/requests/adapters.py", line 449, in send
    timeout=timeout
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/databricks/python3/lib/python3.5/site-packages/urllib3/util/retry.py", line 398, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='management.azure.com', port=443): Max retries exceeded with url: /subscriptions/70097067-53c6-4f5c-8b88-bdbbf7d57f43/resourceGroups/test_resource_group2/providers/Microsoft.MachineLearningServices/workspaces/test_workspace2?api-version=2018-03-01-preview (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",),))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/databricks/spark/python/pyspark/worker.py", line 262, in main
    process()
  File "/databricks/spark/python/pyspark/worker.py", line 257, in process
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "/databricks/spark/python/pyspark/serializers.py", line 373, in dump_stream
    vs = list(itertools.islice(iterator, batch))
  File "/databricks/spark/python/pyspark/util.py", line 55, in wrapper
    return f(*args, **kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/train/automl/_adb_run_experiment.py", line 68, in adb_run_experiment
    worker_id)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/train/automl/_adb_run_experiment.py", line 133, in __init__
    self.run_history_client = self._create_client(RUNHISTORY_CLIENT)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/train/automl/_adb_run_experiment.py", line 318, in _create_client
    user_agent=client_type)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_restclient/experiment_client.py", line 42, in __init__
    **kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_restclient/workspace_client.py", line 40, in __init__
    self._service_context = ServiceContext(subscription_id, resource_group, workspace_name, workspace_id, auth)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_history/service_context.py", line 154, in __init__
    self._endpoints = self._fetch_endpoints()
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_history/service_context.py", line 241, in _fetch_endpoints
    service_name=discovery_key)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 59, in get_service_url
    unique_id=workspace_id)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 201, in get_cached_service_url
    return self.get_cached_flight(arm_scope, service_name, flight, unique_id=unique_id)[service_name]
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 116, in wrapper
    return test_function(self, *args, **kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 182, in get_cached_flight
    cache[cache_key][flight] = super(CachedServiceDiscovery, self).get_flight(arm_scope, flight)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 74, in get_flight
    discovery_url = self.get_discovery_url(arm_scope)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 89, in get_discovery_url
    resource = self._get_team_resource(arm_scope)
  File "/databricks/python3/lib/python3.5/site-packages/azureml/_base_sdk_common/service_discovery.py", line 99, in _get_team_resource
    status = requests.get(urljoin(arm_endpoint, arm_scope), headers=headers, params=query_parameters)
  File "/databricks/python3/lib/python3.5/site-packages/requests/api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/requests/api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/databricks/python3/lib/python3.5/site-packages/requests/adapters.py", line 514, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='management.azure.com', port=443): Max retries exceeded with url: /subscriptions/70097067-53c6-4f5c-8b88-bdbbf7d57f43/resourceGroups/test_resource_group2/providers/Microsoft.MachineLearningServices/workspaces/test_workspace2?api-version=2018-03-01-preview (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",),))

	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:317)
	at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:457)
	at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:440)
	at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:271)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
	at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
	at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
	at org.apache.spark.InterruptibleIterator.to(InterruptibleIterator.scala:28)
	at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
	at org.apache.spark.InterruptibleIterator.toBuffer(InterruptibleIterator.scala:28)
	at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
	at org.apache.spark.InterruptibleIterator.toArray(InterruptibleIterator.scala:28)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:951)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:951)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2181)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2181)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:112)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:384)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1747)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1735)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1734)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1734)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:962)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:962)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:962)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1970)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1918)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1906)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:759)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2141)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2162)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2181)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2206)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:951)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:375)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:950)
	at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:196)
	at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
	at py4j.Gateway.invoke(Gateway.java:295)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:251)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.api.python.PythonException: [identical Python traceback and executor stack trace as above]
	... 1 more
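
In short: each worker-side child run constructs a ServiceContext, which calls ARM (management.azure.com) for service discovery, and the TLS handshake from the executors fails certificate verification; Spark retries the task 4 times and then aborts the stage. A quick way to check whether the executors can complete a TLS handshake with ARM at all, independent of AutoML (a minimal sketch; check_arm is just a throwaway helper for this test):

def check_arm(_):
    # Bare HTTPS request to ARM from the executor; we only care whether
    # the TLS handshake succeeds, not about the HTTP status code.
    import requests
    try:
        requests.get('https://management.azure.com/', timeout=10)
        return 'handshake ok'
    except requests.exceptions.SSLError as e:
        return 'ssl error: %s' % e

print(sc.parallelize(range(2), 2).map(check_arm).collect())

If this also fails with "certificate verify failed", the problem is the CA bundle (or an intercepting proxy) on the worker nodes rather than anything in the AutoML configuration.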
