
Conversation

hkethi002
Contributor

@hkethi002 hkethi002 commented Sep 20, 2017

Notes

  • New endpoint POST /download/summary
  • request body
[{
    "level": "projects" | "sessions" | "acquisitions" | "analyses",
    "_id": "<ContainerId>"
}]
  • response
{
    "dicom": {
        "count": 1,
        "mb_total": 23124,
        "_id": "dicom"
    },
    ...
}
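As a sketch of how a client might build the request body described above (the base URL, session id, and helper name are illustrative assumptions; only the route and body shape come from the PR notes):

```python
# Hypothetical client-side helper; API_BASE and the container id are made up.
API_BASE = 'https://example.com/api'

VALID_LEVELS = ('projects', 'sessions', 'acquisitions', 'analyses')

def build_summary_request(level, container_id):
    """Build one entry of the POST /download/summary request body."""
    if level not in VALID_LEVELS:
        raise ValueError('level must be one of %s' % (VALID_LEVELS,))
    return {'level': level, '_id': container_id}

body = [build_summary_request('sessions', '5a0c1e2f3a4b5c6d7e8f9a0b')]
# requests.post(API_BASE + '/download/summary', json=body)  # the actual HTTP call
print(body)
```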

Review Checklist

  • Tests were added to cover all code changes
  • Documentation was added / updated
  • Code and tests follow standards in CONTRIBUTING.md

@codecov-io

codecov-io commented Sep 20, 2017

Codecov Report

Merging #934 into master will increase coverage by 0.02%.
The diff coverage is 92.72%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #934      +/-   ##
==========================================
+ Coverage   90.14%   90.16%   +0.02%     
==========================================
  Files          48       48              
  Lines        6462     6516      +54     
==========================================
+ Hits         5825     5875      +50     
- Misses        637      641       +4
Flag       Coverage Δ
#python    90.16% <92.72%> (+0.02%) ⬆️

Impacted Files     Coverage Δ
api/api.py         100% <ø> (ø) ⬆️
api/download.py    95.06% <92.72%> (-0.54%) ⬇️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3a9ade2...d6116b7. Read the comment docs.

@nagem
Contributor

nagem commented Sep 25, 2017

Looks good, but we don't actually need the node list returned with the summary results. Are you providing the node list to make the eventual download request more efficient?


try:
    result = config.db.command('aggregate', cont_name, pipeline=pipeline)
except Exception as e:  # pylint: disable=broad-except

The string version of the failure would be returned to the API user. In situations like this it's better to log the error internally (`log.warning(e)`) and then `self.abort(500, 'Failure to load summary')`, or something similar, so the user isn't sent a raw mongo error.
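The pattern the reviewer suggests can be sketched like this; `APIError` is a hypothetical stand-in for the handler's `self.abort`, and `db_command` is injected so the sketch stays self-contained:

```python
import logging

log = logging.getLogger(__name__)

class APIError(Exception):
    """Hypothetical stand-in for the handler's self.abort(status, message)."""
    def __init__(self, status, message):
        self.status = status
        super().__init__(message)

def run_summary_pipeline(db_command, cont_name, pipeline):
    """Run the aggregation; log driver errors server-side and raise a
    generic 500 so raw mongo details never reach the client."""
    try:
        return db_command('aggregate', cont_name, pipeline=pipeline)
    except Exception as e:  # pylint: disable=broad-except
        log.warning(e)  # full driver error stays in the server log
        raise APIError(500, 'Failure to load summary')
```

The client sees only the generic message; the detailed exception is preserved in the server log for debugging.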

api/download.py Outdated
    containers = ['acquisitions']
elif level == 'analyses':
    cont_query['analyses'] = {'_id': req['_id']}
    containers = containers[-1:]

We should make sure we have tests if we're going to support file summaries for analysis downloads.
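To make the reviewer's request concrete, here is a hedged, self-contained sketch of such a test. `fake_download_summary` is a toy stand-in for the endpoint (the real one aggregates in mongo), and none of these names come from the repo:

```python
# Toy stand-in for POST /download/summary: tally files under each requested
# container. Hypothetical; the real endpoint runs a mongo aggregation.
def fake_download_summary(requests, files_by_container):
    summary = {}
    for req in requests:
        for f in files_by_container.get((req['level'], req['_id']), []):
            entry = summary.setdefault(
                f['type'], {'_id': f['type'], 'count': 0, 'mb_total': 0})
            entry['count'] += 1
            entry['mb_total'] += f['size'] // (1024 * 1024)
    return summary

def test_analysis_download_summary():
    files = {('analyses', 'an1'): [{'type': 'csv', 'size': 2 * 1024 * 1024}]}
    result = fake_download_summary([{'level': 'analyses', '_id': 'an1'}], files)
    assert result['csv']['count'] == 1
    assert result['csv']['mb_total'] == 2

test_analysis_download_summary()
```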

api/download.py Outdated
acquisition_ids = [a['_id'] for a in acquisitions]
parent_ids = [req['_id']] + acquisition_ids

# Grab analyses and their ids

We won't want analyses included in the project/session/acquisition download summary (because they aren't included in the actual download), so it's okay to remove this.

@hkethi002
Contributor Author

That was the idea; it seemed like an easy place to generate it because we're already aggregating over the files anyway.

@nagem
Contributor

nagem commented Sep 25, 2017

Unfortunately that list might be thousands of acquisitions long for large projects, and the user might choose multiple filters, meaning a frontend join of multiple lists of acquisitions, which would lose the efficiency gain.

@nagem
Contributor

nagem commented Oct 11, 2017

Thanks for making the requested changes, LGTM!

@hkethi002 hkethi002 merged commit 1532a58 into master Oct 11, 2017
@hkethi002 hkethi002 deleted the download-summary branch October 11, 2017 16:45