
elasticsearch spark size option to limit the number of documents returned #546

Closed
@kim333

Description


As @costin mentioned in #469, I have been trying to use the batch.size settings to control the number of documents returned from Elasticsearch into an RDD.

However, when I looked through the configuration options, the only batch.size settings I could find were es.batch.size.bytes and es.batch.size.entries, neither of which appears to limit the number of documents returned from Elasticsearch. When I tried these options, elasticsearch-spark did not limit the results either.
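To illustrate, this is roughly how I am passing the settings (a minimal sketch; the node address and index/type names below are placeholders for my actual setup):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

// Placeholder node address and index/type; the relevant part is the es.batch.size.* settings
val conf = new SparkConf()
  .setAppName("es-read-test")
  .set("es.nodes", "localhost:9200")
  .set("es.batch.size.bytes", "1mb")      // does not appear to cap the documents read
  .set("es.batch.size.entries", "1000")   // likewise, seems to affect bulk batching only

val sc = new SparkContext(conf)

// Read the index into an RDD; the full result set still comes back
val rdd = sc.esRDD("myindex/mytype")
println(rdd.count())
```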

What is the correct option to limit the number of documents returned by elasticsearch-spark?
Thanks
