Remove 10k per directory limitation #24

wants to merge 3 commits into



dsnopek commented Sep 4, 2011

So, I know that having 10k objects in a directory isn't a good idea! :-) But I've got to deal with this server/application as it is for the time being.

The Cloud Files limitation is 10k per result set. My branch changes list_directory() to request the next set of objects if it gets exactly 10k results.

It does this by moving most of the functionality of list_directory() into list_directory_internal(), with a slightly changed interface: the return value is the number of objects in the result set, and dir_list isn't cleared; its last item is used as the starting point for the next request. list_directory() then calls list_directory_internal() in a loop as necessary.

It's been a long time since I've written any C code, so I hope I didn't do anything really stupid in there! In any case, go easy on me. ;-)

Best regards,

list_directory() will now request the next set of objects if it gets exactly 10k (the max returned per request per the Rackspace docs)

ryandub commented Sep 13, 2011

Thanks dsnopek! This issue has bitten me a few times.


ocamler and others added some commits Jun 3, 2012

Merge pull request #1 from ocamler/master
Had problems with marker parameter (when last item in a subdirectory)