
EACCES problem with large directory structure #30

Closed
Pe-te opened this issue May 31, 2016 · 5 comments

Comments

@Pe-te

Pe-te commented May 31, 2016

When I run the .files() call on a large directory tree, it starts giving me many instances of the following error:

{ [Error: EACCES: permission denied, scandir '/serverpath/www/htdocs/attachments/2015/11/02/21/53/26']
  errno: -13,
  code: 'EACCES',
  syscall: 'scandir',
  path: '/serverpath/www/htdocs/attachments/2015/11/02/21/53/26' }

Is there any way to prevent this? readFilesStream() sounds promising, but it opens every file, which I don't want; I just need the file names.

The system is an Ubuntu machine.

@fshost
Owner

fshost commented Jun 1, 2016

It looks like you may be getting these errors due to missing read permissions on some of your subdirectories. Can you confirm whether this is the case? If so, it is possibly similar to issue #11. Also, can you inspect one or more of the problem directories (perhaps the one at the path in the error above) with the Node.js fs.stat method and check what stat.mode is? It may be that this can be fixed by excluding a certain mode value, as was done in PR #13 (see commit 204793b), but I'd need to know what that mode value is in your environment.

@Pe-te
Author

Pe-te commented Jun 29, 2016

Sorry, I haven't gotten around to testing this further because we now have a bash script that uses the find command. But it would still be nice to get node-dir working when similar tasks arise.

The failing files were different on each run, so the same file sometimes worked and sometimes didn't. And it only happened when the list got very long.

Are you processing all calls in parallel? Could it be that too many fs.stat calls are issued at once, so too many files are open simultaneously? Is this possible?

I also tried to reproduce it: it worked fine on a Mac, and when I tried to generate the test folders (3 million entries) on my own Ubuntu server, the server ran out of disk space. If you want to try, here is my script, though since I couldn't get it to work on Linux, I'm not sure it will trigger the problem.

test-node-dir.zip

@fshost
Owner

fshost commented Jul 9, 2016

Thanks for the info. So far I'm not able to reproduce the error. Since it seems intermittent and related to processing large directory structures, I'm wondering if there may be an underlying issue in Node.js itself. What version of Node.js were you running when you got the errors?

@Pe-te
Author

Pe-te commented Aug 24, 2016

I don't have access to the machine anymore, but it might also be that the folder was mounted as a shared drive and that caused the issue. If I come across it again I will find out more; for now it can be closed? Unless you want to implement the recursive not-file-opening approach anyway. :)

@fshost
Owner

fshost commented Aug 27, 2016

Will definitely consider the recursive not-file-opening approach. Could you summarize how you'd like to see that work with node-dir in this thread? The more detail the better.

In the meantime I guess I'll close this for now, but am definitely still thinking about your issue. We'll reopen it if we can make some progress.

@fshost fshost closed this as completed Aug 27, 2016