
Instantiating a new scatter plot with resolutionX=1920 and resolutionY=1080 (or larger) causes an OutOfMemoryError #234

konvish opened this issue May 15, 2018 · 0 comments


Expected behavior

Actual behavior

Steps to reproduce the problem


GeoSpark version = 1.1.3

Apache Spark version = 2.2.1

JRE version = 1.8

API type = Java

I increased resolutionX from 1000 to 2000 in steps of 200. Once the value exceeded 1600, an out-of-memory error occurred frequently. I traced the error to the max function: after changing the max call to a reduce call, the out-of-memory error no longer occurred.
Looking at the implementations, max ultimately calls reduce with a comparator. I benchmarked reduce, max, and treeReduce, and reduce was the most efficient, so I suggest replacing the max call with reduce. reduce handles large amounts of data well, and treeReduce is suited to cases where a single reduce must aggregate a very large amount of data.

function     time
max          234 ms
reduce       40 ms
treeReduce   286 ms
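To illustrate the suggested change, here is a minimal sketch using plain Java streams instead of Spark (the issue itself concerns an RDD max call vs. a reduce call; the class name, method names, and sample data below are hypothetical and only show that the two formulations compute the same result):

```java
import java.util.Arrays;
import java.util.List;

public class MaxVsReduce {

    // What the original code effectively does: find the maximum via a comparator.
    static long viaMax(List<Long> counts) {
        return counts.stream().max(Long::compare).get();
    }

    // The suggested alternative: compute the same maximum with an explicit reduce.
    static long viaReduce(List<Long> counts) {
        return counts.stream().reduce(Long.MIN_VALUE, Math::max);
    }

    public static void main(String[] args) {
        // Hypothetical per-pixel counts; in GeoSpark this would be an RDD of pixel weights.
        List<Long> pixelCounts = Arrays.asList(3L, 17L, 9L, 42L, 5L);
        System.out.println(viaMax(pixelCounts) + " " + viaReduce(pixelCounts));
    }
}
```

Both calls return the same value; any performance difference, as in the timings above, would come from how the distributed versions aggregate partitions, not from the result they produce.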