@AnikaC-git would you mind sense-checking the below before I implement the proposed change in the wrapper functions that use `.split_into_chunks`?
**Source of the problem**
`concept_find()` is designed to retrieve concepts in chunks of 99 conceptIds (see `CHUNK_SIZE = 100` in `wrapper.R`) from `GET /{branch}/concepts`. This is due to the URL character limit on GET requests (:arrow_right: error 414 Request-URI Too Large).
But the `limit` on the number of results sent back by snowstorm will often be lower than this: the default is 50, and a user cannot set `limit` above a hard threshold of 10,000.
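To give a sense of scale, here is a back-of-the-envelope illustration (not snomedizer code; the hostname and the randomly generated conceptIds are made up):

```r
# Rough illustration with a hypothetical host and fabricated conceptIds.
# SNOMED CT conceptIds can be up to 18 digits long, so a GET URL listing
# 100 of them quickly approaches common server URL-length limits.
set.seed(42)
concept_ids <- replicate(
  100,
  paste(sample(0:9, 15, replace = TRUE), collapse = "")
)
url <- paste0(
  "https://snowstorm.example.org/MAIN/concepts?conceptIds=",
  paste(concept_ids, collapse = ",")
)
nchar(url)  # over 1,600 characters for a single chunk of 100 conceptIds
```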
**Example**

```r
# Get 221 concept IDs
concepts <- concept_find(ecl = "<233604007", limit = 300)

# Retrieve those
concepts_batch <- concept_find(
  conceptIds = concepts$conceptId,
  limit = 50
)
```

```
Warning messages:
1:
This server request returned just 50 of a total 99 results.
Please increase the server `limit` to fetch all results.
2:
This server request returned just 50 of a total 100 results.
Please increase the server `limit` to fetch all results.
```
Currently, the function displays this warning for every chunk. Say you ask the function to retrieve 153 concepts: that means 2 chunks, and therefore 2 warnings.
Neither elegant nor self-explanatory! Users will not understand why this happens.
Note: the same thing applies to:

- `concepts_included_in`
- `concepts_descriptions`
- `concepts_map`
For the latter two, it is impossible to predict how many results will be returned for each chunk of size 100.
Hence my proposal to force the `limit` to 10,000 (see the sketch below). I don't anticipate this being insufficient, and even if it were, the user would still see the warning (though they'd be unable to do much about it).
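Concretely, something along these lines (a simplified sketch only; the function name `concept_find_chunked`, the constant `.snomedizer_max_limit`, and the stripped-down signature are illustrative, not the actual wrapper code):

```r
# Sketch only -- not the actual snomedizer internals. The idea: every
# chunked request pins `limit` to snowstorm's hard maximum, so a chunk
# of at most 100 conceptIds can never come back truncated.
.snomedizer_max_limit <- 10000

concept_find_chunked <- function(conceptIds, chunk_size = 100) {
  chunks <- split(conceptIds, ceiling(seq_along(conceptIds) / chunk_size))
  results <- lapply(chunks, function(ids) {
    concept_find(conceptIds = ids, limit = .snomedizer_max_limit)
  })
  do.call(rbind, results)  # reassemble a single data frame across chunks
}
```

For `concept_find()`, a chunk of 100 conceptIds can return at most 100 rows, so the pinned limit silences the warning on that path; for the description and map wrappers the warning remains reachable in principle, as noted above.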
I'm very sorry, but I'm not sure I fully understand the problem from the information provided. So the API that the wrapper functions access has a query limit, and this limit cannot be changed? And therefore there is a splitter that basically splits results into smaller chunks?
No worries, it's a little confusing.
Yes, there is a limit on the number of results for every snowstorm API operation. By default it is set to 50, but a user can set the `limit` anywhere up to a maximum of 10,000.
In addition, snomedizer wrapper functions use a chunking mechanism, for two reasons:

- to get around the hard maximum of 10,000 results;
- to avoid issues when request parameters are too long, for example when requesting too many concepts at once (some API operations accept arrays), which would exceed the HTTP request URL character limit.

I set this chunk size to 100 in snomedizer based on trial and error (a simplified sketch of the idea is below).
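For reference, the chunking itself is conceptually just a split over the input vector, roughly like this (a simplified sketch; the real `.split_into_chunks` in `wrapper.R` may differ in its details):

```r
# Simplified sketch of the chunking idea, not the exact wrapper.R code.
.split_into_chunks <- function(x, chunk_size = 100) {
  split(x, ceiling(seq_along(x) / chunk_size))
}

lengths(.split_into_chunks(1:153))  # 153 inputs -> 2 chunks
#>   1   2
#> 100  53
```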
Hope this makes sense! Happy to set up a time to discuss.