
Fix 1805 #2087

Closed
wants to merge 13 commits into from

Conversation

josenavas
Contributor

This grabs @antgonza's code and fixes it so the download through nginx works correctly.

@antgonza @ElDeveloper are you able to do a review?

@josenavas mentioned this pull request Mar 22, 2017
@coveralls

Coverage Status

Coverage increased (+0.01%) to 91.595% when pulling d96da86 on josenavas:fix-1805 into 2795046 on biocore:master.

@antgonza
Member

Can you also add to the nginx example the changes you made in the test-env config? Thanks!
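For context, mod_zip setups typically pair an internal nginx location with the app's `/protected/` URLs. The fragment below is only an illustrative sketch of the kind of change being requested, not Qiita's actual test-env config; the alias path is a placeholder.

```nginx
# Hypothetical sketch of a mod_zip-style protected-download location.
# The alias path is a placeholder, not Qiita's real data directory.
location /protected/ {
    internal;                       # reachable only via internal redirects
    alias /path/to/qiita/base_data_dir/;
}
```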

for i, (fid, path, data_type) in enumerate(a.filepaths):
    # validate access only of the first artifact filepath,
    # the rest have the same permissions
    if (i == 0 and not vfabu(user, fid)):
Member

Can you add a test for this case?

Contributor Author

Done

    if (i == 0 and not vfabu(user, fid)):
        to_add = False
        break
    if data_type == 'directory':
Member

... and one for this?

Contributor Author

Done

        spath = path[basedir_len:]
        to_download.append((path, spath, spath))
    else:
        to_download.append((path, path, path))
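The loop under review can be sketched as a standalone function, with the Qiita internals (`a.filepaths`, `vfabu`, `basedir_len`) replaced by plain arguments so its behavior is easy to check in isolation. This is an illustrative reconstruction, not the PR's actual implementation.

```python
def build_download_list(filepaths, user_has_access, basedir_len):
    """Return (full_path, short_path, name) tuples for the download handler.

    filepaths: iterable of (fid, path, data_type) tuples.
    user_has_access: stand-in for vfabu(user, fid); checked only on the
        first filepath because all of an artifact's files share permissions.
    basedir_len: length of the base-directory prefix stripped from paths
        inside a 'directory' filepath.
    """
    to_download = []
    for i, (fid, path, data_type) in enumerate(filepaths):
        # validate access only for the first artifact filepath,
        # the rest have the same permissions
        if i == 0 and not user_has_access(fid):
            break
        if data_type == 'directory':
            # shorten directory-contained paths relative to the base dir
            spath = path[basedir_len:]
            to_download.append((path, spath, spath))
        else:
            to_download.append((path, path, path))
    return to_download
```

Note that when the access check fails on the first filepath, the whole artifact is skipped, which is exactly why the reviewer asked for tests covering both branches.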
Member

... and this?

Contributor Author

Not sure how to add a test; added a comment explaining why.

% a.id))

# If we don't have nginx, write a file that indicates this
all_files = '\n'.join(["- %s /protected/%s %s" % (getsize(fp), sfp, n)
Member

I liked having the message "This installation of Qiita was not equipped with nginx, ...". Is it possible to add it?

Contributor Author

According to the mod_zip documentation you cannot have any extra lines, so I don't think it is possible to add that information.

@ElDeveloper
Member

👍

@coveralls

Coverage Status

Coverage increased (+0.07%) to 91.653% when pulling e90ad50 on josenavas:fix-1805 into 2795046 on biocore:master.


# If we don't have nginx, write a file that indicates this
all_files = '\n'.join(["%s %s /protected/%s %s"
% (compute_checksum(fp), getsize(fp), sfp, n)
Contributor

compute_checksum is going to be expensive as it requires reading each file in whole from the filesystem. This is multiplied by the potential number of files collected from the deeply nested if statements. Are you sure this should be executed by tornado?

Member

Agree with @wasade. Note that this was a test and we are removing it.

@coveralls

Coverage Status

Coverage increased (+0.07%) to 91.655% when pulling 587e5bf on josenavas:fix-1805 into 2795046 on biocore:master.

@josenavas
Contributor Author

Is there anything else needed in this PR?

@antgonza
Member

Just to keep everyone in the loop: we realized that in some cases the downloaded zip will not open on Macs because, due to the size of the files, it requires unzip >= 6.00. Thus, we decided to host a Mac build of unzip 6.00 on the FTP site (since the only other way to install it is via brew), and to add a GUI line under the button pointing to new documentation explaining the issue and the fix.

@antgonza
Member

Closing, as I'm going to create another PR with the requested fixes, as @josenavas requested.

@antgonza antgonza closed this Mar 27, 2017
@ElDeveloper
Member

Oh, I think I have seen this before; what about using gzip?

My past experience was that large zip files could not be opened unless I used jar xvf files.zip or something like that.

@antgonza antgonza mentioned this pull request Mar 27, 2017
@antgonza
Member

mod_gzip for nginx doesn't really work like mod_zip. In other words, we can't use mod_gzip for what we are doing here ...
