Commit

add documentation for the iter_pages function
pcardune committed Jan 15, 2012
1 parent 05794a6 commit a7b2a3e
Showing 2 changed files with 23 additions and 4 deletions.
18 changes: 18 additions & 0 deletions README.md
@@ -86,6 +86,24 @@ profile picture, you can use the `graph_url` function:
    profile_pic = graph_url('/zuck/picture')
    urlretrieve(profile_pic, 'zuck.jpg')


### Advanced Graph API ###

fbconsole also provides utilities for some of the more advanced graph api
features.


__iter_pages__

If you are fetching a lot of data, you may need to make multiple requests to
the graph api, following the "paging" values sent back with each response. You
can use `iter_pages` to iterate through multiple requests automatically. For
example, you can iterate through all your wall posts:

    for post in iter_pages(fbconsole.get('/me/posts')):
        print post['message']
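As a rough illustration of what paging iteration involves, here is a minimal
sketch of a generator that follows the `paging` → `next` URL in each response
until the data runs out. This is an assumption-laden sketch written in modern
Python 3 for clarity (the original project targeted Python 2), not the actual
fbconsole implementation; the `fetch` parameter is hypothetical and stands in
for an HTTP GET that decodes the JSON body.

```python
import json
from urllib.request import urlopen


def iter_pages(json_response, fetch=None):
    """Yield every item across all pages of a Graph API response.

    `json_response` is a dict shaped like
    {'data': [...], 'paging': {'next': '<url>'}}.

    `fetch` (hypothetical parameter, for illustration/testing) takes the
    next-page URL and returns the decoded JSON dict for that page.
    """
    if fetch is None:
        # Sketch of a default fetcher: GET the URL and decode the JSON body.
        fetch = lambda url: json.load(urlopen(url))
    while json_response.get('data'):
        # Yield every item on the current page.
        for item in json_response['data']:
            yield item
        # Follow the "next" paging URL, if any; otherwise we are done.
        next_url = json_response.get('paging', {}).get('next')
        if not next_url:
            break
        json_response = fetch(next_url)
```

The generator form means pages are only fetched as the caller consumes items,
so iterating over the first few results does not pull down the whole feed.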


### More Authentication Options ###

By default, fbconsole will make all its requests as the fbconsole facebook app.
9 changes: 5 additions & 4 deletions src/fbconsole.py
@@ -244,10 +244,11 @@ def get(path, params=None):
 def iter_pages(json_response):
     """Iterate over multiple pages of data.
-    The graph api can return a lot of data, but will only return so much data in
-    a single request. To get more data, you must query for it explicitly using
-    paging. This function will do automatic paging in the form of an iterator.
-    For example to print the id of every photo tagged with the logged in user:
+    The graph api can return a lot of data, but will only return a limited
+    amount of data in a single request. To get more data, you must query for it
+    explicitly using paging. This function will do automatic paging in the form
+    of an iterator. For example to print the id of every photo tagged with the
+    logged in user:
     >>> total = 0
     >>> for photo in iter_pages(get('/19292868552/feed', {'limit':2})):
