memtests failing? #57
hi, can you show me an example of a CRAN check that has these failures?
https://cloud.r-project.org/web/checks/check_results_BuyseTest.html

(I downloaded the check pages myself)
The API does collect additional issues - e.g., https://cranchecks.info/pkgs/BuyseTest - so you can look in that array for each package's data. There is no search for this API though, so you'd have to pull down data for all pkgs and then search across the additional issues. We've been considering adding search, but just haven't had enough strong use cases yet.

Note that there's historical data up on Amazon S3 (https://github.com/ropenscilabs/cchecksapi/blob/master/docs/api_docs.md#history) - each day of checks data is zipped up as newline-delimited JSON in a single file - you can just pull those files down and read them in with jsonlite::stream_in, I think.

Also note that we don't collect the text of the individual platform check pages for each package; we only have what's on the package-level HTML page. We've been thinking about scraping the other pages linked to, so that we'd have all the data, but just haven't had time to do that.
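The S3 history workflow described above could be sketched roughly as follows: pull down a day's file, decompress it, parse the newline-delimited JSON, and scan each record's additional issues. This is only a sketch - the record field names (`package`, `checks`, `additional_issues`) are guesses based on the `/pkgs/BuyseTest` example, not something confirmed in this thread.

```python
import json


def read_ndjson(raw: bytes) -> list:
    """Parse newline-delimited JSON (one object per line) into a list of dicts.

    The daily S3 files are zipped up, so in practice you would first
    decompress the downloaded file (e.g. with gzip.decompress) before
    passing its bytes here.
    """
    return [
        json.loads(line)
        for line in raw.decode("utf-8").splitlines()
        if line.strip()
    ]


def packages_with_issue(records: list, keyword: str) -> list:
    """Return names of packages whose additional issues mention `keyword`.

    NOTE: the 'package' and 'checks' / 'additional_issues' field names
    are assumptions about the record layout.
    """
    hits = []
    for rec in records:
        issues = rec.get("checks", {}).get("additional_issues") or []
        # Serialize each issue entry so the keyword match covers all fields.
        if any(keyword in json.dumps(issue).lower() for issue in issues):
            hits.append(rec.get("package"))
    return hits
```

With one day's file loaded, something like `packages_with_issue(read_ndjson(day_file), "valgrind")` would then list the packages flagged by that memtest on that day.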
awesome thanks for the info
anything else?
It would be more user-friendly to have a search API endpoint, so that I would only have to make one request per query (rather than N requests, where N is the number of packages - basically the same complexity as downloading the raw CRAN check pages). So maybe you want to keep this issue open for that; otherwise it's fine with me to close.
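For concreteness, the N-request approach being compared here would look something like the sketch below, using the per-package endpoint shown earlier in the thread. The `fetch` callable and the `additional_issues` field name are assumptions for illustration, not the API's documented layout.

```python
from urllib.parse import quote

API = "https://cranchecks.info"


def pkg_check_url(package: str) -> str:
    """URL for one package's check data (endpoint shown in this thread)."""
    return f"{API}/pkgs/{quote(package)}"


def find_matches(all_packages, fetch, keyword="memory"):
    """Illustrates the O(N) cost: one request per package.

    `fetch` is any callable mapping a URL to a parsed-JSON dict (e.g. a
    thin wrapper around urllib.request); the 'additional_issues' field
    name is an assumption about the response shape.
    """
    hits = []
    for name in all_packages:
        data = fetch(pkg_check_url(name))
        issues = data.get("additional_issues") or []
        if any(keyword in str(issue).lower() for issue in issues):
            hits.append(name)
    return hits
```

With ~15k CRAN packages, that loop is 15k HTTP requests per query, which is why a single server-side `/search` request is the friendlier design.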
I agree that search would be the most user-friendly option. It does require more work, as you'd imagine; I'll have a look and see if we can do this easily.
- /search route searches in the histories table, so it doesn't include the newest data in MongoDB - data searched would be up to ~24 hrs old, but could be just 1 hr old, e.g.
- added new Search ActiveRecord class to handle /search route requests
- added minimal /search info to docs
@tdhock just pushed a change to the API; it now has search. For "memory": https://cranchecks.info/search?q=memory

`one_each` - so you can get one result per package: https://cranchecks.info/search?q=memory&one_each=true

`fields` - limit the fields returned (date_updated is always returned): https://cranchecks.info/search?q=memory&one_each=true&fields=package

@maelle ^ search added.
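A small helper for composing these search URLs, as a sketch: the parameter names (`q`, `one_each`, `fields`) are taken from the examples above, and any other options are not assumed here. A real client would then fetch the resulting URL with urllib.request or similar.

```python
from urllib.parse import urlencode

BASE = "https://cranchecks.info/search"


def search_url(q, one_each=None, fields=None):
    """Build a cranchecks /search query URL.

    q        -- search string, e.g. "memory"
    one_each -- if True, request one result per package
    fields   -- optional list of fields to return (date_updated is
                always included by the API, per the examples above)
    """
    params = {"q": q}
    if one_each is not None:
        params["one_each"] = "true" if one_each else "false"
    if fields:
        params["fields"] = ",".join(fields)
    return BASE + "?" + urlencode(params)
```

For the memtests question that opened this issue, a single call like `search_url("memory", one_each=True, fields=["package"])` replaces the per-package polling entirely.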
hi @sckott, can we query your db to get a list of all packages for which at least one of the memtests (https://www.stats.ox.ac.uk/pub/bdr/memtests/README.txt) fails?