Pagination on the Rackspace DNS API? #1887
Comments
@thattommyhall yeah. There is some weirdness there, I fear. I believe .each should "do the right thing", but map doesn't, for reasons I'm not totally certain of.
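A minimal sketch (not fog's actual code) of how this split behavior can arise: if a collection subclasses Array and overrides #each to paginate, #each will walk every page, but Array#map is implemented in C over the array's own elements and never calls the subclass's #each, so it only sees the first loaded page. The class and constant names below are hypothetical.

```ruby
# Hypothetical collection: only the first "page" of records is loaded into
# the underlying Array, but #each is overridden to fetch every page.
class PagedCollection < Array
  PAGE_SIZE = 100

  def initialize(all_records)
    @all_records = all_records
    super()
    concat(all_records.first(PAGE_SIZE)) # the Array itself holds one page
  end

  # Walks all pages, yielding every record.
  def each(&block)
    return enum_for(:each) unless block
    @all_records.each_slice(PAGE_SIZE) { |page| page.each(&block) }
    self
  end
end

zones = PagedCollection.new((1..317).to_a)

via_each = 0
zones.each { via_each += 1 }
via_each                 # => 317: the overridden #each paginates
zones.map { |r| r }.size # => 100: Array#map bypasses the override
```

This would explain why .each "does the right thing" while map silently stops at 100.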
At the 100 mark it gives out.
I'm confused.
@thattommyhall - I'm confused too, I fear. @krames - could you help us figure out what's up? Thanks!
@thattommyhall Sorry about that! Let me see if I can't dig into the issue and figure out what's going on for you.
@thattommyhall I am currently looking into this. I think there might be a bug in there. I'll let you know!
@geemus @thattommyhall While researching this I noticed something interesting. The zones.rb class seems to make a couple of extraneous calls to DNS when you execute the following:
The code actually lazy-initializes itself by calling […]. It would seem to me that […]. Based on that assumption, I created the following gist: https://gist.github.com/krames/5800459. Is this the best approach? How should this work in fog? Thanks!
@krames - all probably should not load everything and then iterate, I fear. If a user has a really large list (say 10,000), loading and iterating over it all becomes pretty untenable. What we have done elsewhere (i.e. S3) is to have all get a single page, and have each do the iteration, passing results to the block one page at a time. That way you can choose between the behaviors. If we made all just return everything, how would you get a single page?
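A rough sketch of the pattern described above, with hypothetical class and method names (the real fog/S3 implementation differs): #all fetches exactly one page, while #each loops over pages internally so the caller sees every record without ever holding the full list in memory.

```ruby
# Hypothetical paginated collection following the "all = one page,
# each = all pages" convention described in the comment above.
class Zones
  PAGE_SIZE = 100

  def initialize(service)
    @service = service # assumed to respond to #list(offset:, limit:)
  end

  # Fetch exactly one page; the caller controls offset/limit.
  def all(offset: 0, limit: PAGE_SIZE)
    @service.list(offset: offset, limit: limit)
  end

  # Iterate every record, fetching one page at a time behind the scenes.
  def each
    return enum_for(:each) unless block_given?
    offset = 0
    loop do
      page = all(offset: offset)
      page.each { |record| yield record }
      break if page.size < PAGE_SIZE
      offset += PAGE_SIZE
    end
    self
  end
end

# Stub backend with 317 records for demonstration.
class StubService
  def initialize(count)
    @records = (1..count).to_a
  end

  def list(offset:, limit:)
    @records[offset, limit] || []
  end
end

zones = Zones.new(StubService.new(317))
zones.all.size            # => 100: a single page
total = 0
zones.each { total += 1 }
total                     # => 317: iteration covers every page
```

This keeps memory bounded for huge lists while still letting callers ask for one specific page via all.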
I have submitted a pull request to address this issue. |
If I call
Fog::DNS.new(RACKSPACE_CREDENTIALS).zones.map(&:domain).count
=> 100
Whereas I know I have 317 domains.
I assume the Rackspace API does pagination and your provider does not fetch all the results.
I may have time to fix it myself soon.
Tom Hall
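Until the collection handles pagination itself, a workaround is to page through the list manually. This is a hedged sketch: the limit/offset parameter names are assumptions modeled on Rackspace Cloud DNS's query parameters, and list_domains is a hypothetical stand-in for the actual provider call.

```ruby
# Hypothetical workaround: accumulate every domain by requesting
# successive pages until a short (or empty) page signals the end.
def all_domains(client, page_size: 100)
  domains = []
  offset = 0
  loop do
    page = client.list_domains(limit: page_size, offset: offset)
    domains.concat(page)
    break if page.size < page_size
    offset += page_size
  end
  domains
end

# Stub client with 317 domains, mimicking a paginated listing call.
StubClient = Struct.new(:records) do
  def list_domains(limit:, offset:)
    records[offset, limit] || []
  end
end

client = StubClient.new((1..317).map { |i| "domain#{i}.example" })
all_domains(client).count # => 317, not 100
```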