
Commit

Improve documentation for wait_on_rate_limit parameter for streaming
Resolves part of #1986
Harmon758 committed Oct 22, 2022
1 parent 3a71c9e commit 7f0d587
Showing 2 changed files with 6 additions and 2 deletions.
4 changes: 3 additions & 1 deletion tweepy/asynchronous/streaming.py
@@ -570,7 +570,9 @@ class AsyncStreamingClient(AsyncBaseClient, AsyncBaseStream):
     return_type : type[dict | requests.Response | Response]
         Type to return from requests to the API
     wait_on_rate_limit : bool
-        Whether to wait when rate limit is reached
+        Whether or not to wait before retrying when a rate limit is
+        encountered. This applies to requests besides those that connect to a
+        stream (see ``max_retries``).
     max_retries: int | None
         Number of times to attempt to (re)connect the stream.
     proxy : str | None
4 changes: 3 additions & 1 deletion tweepy/streaming.py
@@ -563,7 +563,9 @@ class StreamingClient(BaseClient, BaseStream):
     return_type : type[dict | requests.Response | Response]
         Type to return from requests to the API
     wait_on_rate_limit : bool
-        Whether to wait when rate limit is reached
+        Whether or not to wait before retrying when a rate limit is
+        encountered. This applies to requests besides those that connect to a
+        stream (see ``max_retries``).
     chunk_size : int
         The default socket.read size. Default to 512, less than half the size
         of a Tweet so that it reads Tweets with the minimal latency of 2 reads
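The updated docstrings draw a distinction that is easy to miss: ``wait_on_rate_limit`` governs the ordinary API requests made through a streaming client (for example, managing stream rules), not the connection to the stream itself, whose reconnection behavior is controlled separately (see ``max_retries``). Below is a minimal sketch, not part of this commit, illustrating that distinction with the public ``StreamingClient`` API; the bearer token and the rule value are placeholders.

```python
import tweepy

# "BEARER_TOKEN" is a placeholder; substitute a real credential.
client = tweepy.StreamingClient(
    "BEARER_TOKEN",
    wait_on_rate_limit=True,  # wait and retry when a non-stream request is rate limited
)

# add_rules() is a regular API request, so it is covered by wait_on_rate_limit.
client.add_rules(tweepy.StreamRule("tweepy"))

# filter() opens the streaming connection itself; wait_on_rate_limit does not
# govern reconnecting this stream.
client.filter()
```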
