Here we have an implementation of `.slice`, a method that takes a `cursor_id` and a range of time and returns all matching events. We also have a `dataframe` method that applies `.slice` to each span.

TODO:
- `.slice` makes multiple HTTP requests when the requested span contains a large number of events. Currently it just concatenates the lists of events from each request into one big list, whereas the Python lib does some restructuring of the data, giving you events on a per-stream basis. In each response payload there's an entry in `streams` for every stream involved in the given query, and each stream ID maps to an object containing some details about that stream. I have access to those details, but since R parses the JSON into a list, I can't seem to find the stream IDs in the data structure.
- `dataframe` currently returns a `list`. I need to figure out how to construct a proper data frame, but `.slice` should be improved first.
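For the list-to-data-frame step, one base-R approach is to coerce each per-event list to a one-row data frame and `rbind` them. The field names below (`time`, `value`) are hypothetical placeholders for whatever the events actually contain:

```r
# Sketch: flatten a list of per-event lists into a data.frame.
# "time" and "value" are assumed field names for illustration.
events <- list(
  list(time = 1, value = 0.5),
  list(time = 2, value = 0.7)
)

# as.data.frame() turns each scalar-valued list into a one-row frame;
# do.call(rbind, ...) stacks them.
df <- do.call(rbind, lapply(events, as.data.frame))
```

This breaks down if events have ragged fields (missing keys), in which case `data.table::rbindlist(events, fill = TRUE)` or `dplyr::bind_rows()` handle the gaps.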
- `dataframe` makes an HTTP request per span via `.slice`. In Python we do this concurrently via a `ThreadPool`. I'm under the impression that R is single-threaded, so I'm not sure what to do here. @apclypsr, do I have options? It's currently quite slow.
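Not an authoritative answer, but the R interpreter being single-threaded doesn't rule out concurrency: the base `parallel` package can fan work out across worker processes. A sketch with a dummy function standing in for the `.slice` HTTP call (names here are hypothetical):

```r
# Sketch: parallelize per-span requests with the base parallel package.
# mclapply() forks on Unix-likes; on Windows, mc.cores must be 1
# (use parLapply() with a PSOCK cluster there instead).
library(parallel)

slow_fetch <- function(span) {  # stand-in for a .slice HTTP request
  Sys.sleep(0.05)
  span * 2
}

spans <- 1:8
results <- mclapply(spans, slow_fetch,
                    mc.cores = if (.Platform$OS.type == "unix") 2 else 1)
```

Since the bottleneck is network I/O rather than CPU, another option worth a look is the `curl` package's multi interface (`curl::multi_add()` / `curl::multi_run()`), which issues many HTTP requests concurrently in a single R process.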