
429 [Too Many Requests] from Instagram #319

Closed
welcomemax opened this issue Mar 30, 2018 · 16 comments

Comments
welcomemax commented Mar 30, 2018
On one of my sites I get a 429 error on requests to ACCOUNT_PAGE or ACCOUNT_JSON_INFO. Other queries go through without problems. It looks very much like a ban.
Has anyone else come across this?
Is there another way to get user_id?


liamka commented Mar 30, 2018

Same

Response code is 429


th4deu commented Mar 31, 2018

They changed the way you pass the variables.
Part of the URI now has to be sent in JSON format, and there is a new variable called query_hash that is sent along with it.

I adapted the URL in my code accordingly and it works fine.


aik27 commented Apr 1, 2018

@th4deu code example, please.


qunn commented Apr 1, 2018

@th4deu seconding the question above: could you share example code? Where do you get query_hash, or how do you generate it?


qunn commented Apr 1, 2018

Maybe use the URL https://www.instagram.com/p/{shortcode}/?__a=1 instead of https://www.instagram.com/graphql/query/?query_id=17852405266163336&shortcode={code}&first=10&after=10 to get comments and likes?

Tests show that with many queries the second link always returns a rate limit response, but I haven't noticed that with the first one.
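For reference, a minimal sketch of what this first approach might look like with plain cURL. The shortcode is illustrative (borrowed from later in this thread), and the field names below are assumptions based on the 2018-era response shape, where the payload sat under graphql.shortcode_media; verify against a live response.

<?php
// Fetch a post via the ?__a=1 endpoint instead of /graphql/query/.
$shortcode = 'BfwTKhUl9v7'; // illustrative shortcode, not part of qunn's code
$url = 'https://www.instagram.com/p/' . $shortcode . '/?__a=1';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
curl_close($ch);

$json = json_decode($body, true);
// As of 2018 the media object lived under graphql.shortcode_media;
// these keys may have changed since.
$media    = $json['graphql']['shortcode_media'] ?? null;
$likes    = $media['edge_media_preview_like']['count'] ?? 0;
$comments = $media['edge_media_to_comment']['edges'] ?? [];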


th4deu commented Apr 1, 2018

@aik27 and @qunn, they just replaced "query_id" with "query_hash".
In my testing its value doesn't change. Here's an example of the called URI (I'm using my own method to call it):

$url_graph = 'https://www.instagram.com/graphql/query/';
$variables = rawurlencode('{"shortcode":"' . $code . '","first":"1000","after":"' . $nextPageCode . '"}');
$url_graph_query = $url_graph . '?query_hash=33ba35852cb50da46f5b5e889df7d159&variables=' . $variables;

It'll generate a URL like this: https://www.instagram.com/graphql/query/?query_hash=33ba35852cb50da46f5b5e889df7d159&variables=%7B%22shortcode%22%3A%22BfwTKhUl9v7%22%2C%22first%22%3A22%2C%22after%22%3A%22AQC3SITifpbp7D8eeB4wVJozUXAe_ZRkCCLJ6Th4H-bVshhNiMO_h0kotBT2Tvdud5q9MCalawXV5IYqCBOkrgOatfbOmhq0ocPBHcScVGt-9g%22%7D

Anyway, I've been monitoring my website's calls to the Instagram "API" since yesterday and the rate limit is still being applied. Since there are a lot of API calls, I'm considering changing the server's IP every hour to avoid the rate limit; the limit seems to be renewed each hour.


qunn commented Apr 1, 2018

@th4deu Thank you! Your solution is a little better and I've taken note of it, but for now I replaced the old COMMENTS_BEFORE_COMMENT_ID_BY_CODE URL in Endpoints.php with https://www.instagram.com/p/{{shortcode}}/?__a=1 (plus some other code fixes), and the result is no problems: after 30 minutes of many queries, no rate limits.

And my solution for changing the IP is cURL + SOCKS5 or IPv6 proxies. I keep a serialized array of proxies (10 in my list), pick one at random, and if I get a 429 I take the next one in the array, cycling through them in a do-while loop.
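A rough sketch of the rotation qunn describes, assuming plain cURL built with SOCKS5 support; the proxy addresses and target URL are placeholders, not values from this thread.

<?php
// Rotate through SOCKS5 proxies, moving to the next one whenever the
// response is a 429.
$proxies = ['127.0.0.1:9051', '127.0.0.1:9052']; // placeholders; qunn keeps 10
$url = 'https://www.instagram.com/p/BfwTKhUl9v7/?__a=1'; // illustrative target

$i = random_int(0, count($proxies) - 1); // start at a random proxy
do {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_PROXY, $proxies[$i]);
    curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_SOCKS5);
    $body = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    $i = ($i + 1) % count($proxies); // advance to the next proxy for a possible retry
} while ($code === 429);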


th4deu commented Apr 1, 2018

@qunn, your endpoint works for the first query, but how are you handling the ones after that?

The first query was already working for me, but from the second one onward I was getting rate limit errors after a number of queries.


qunn commented Apr 1, 2018

@th4deu I only use this URL for the getMediaCommentsByCode() function in Instagram.php. I modified that function to read from $jsonResponse['graphql'] instead of $jsonResponse['data'] (and made the same replacement everywhere ['data'] is used in it), and I changed the COMMENTS_BEFORE_COMMENT_ID_BY_CODE constant in Endpoints.php to https://www.instagram.com/p/{{shortcode}}/?__a=1.

That's all I did, and it works fine: no rate limits, and I haven't used proxies for more than 4 hours. :) Maybe I don't understand your question; what do you mean by the first query, this function or another one?
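A minimal sketch of the change qunn describes, using the constant and function names from this library; the surrounding code is abbreviated here and will differ from the actual Instagram.php source, and the edge_media_to_comment key is an assumption based on the 2018-era response.

<?php
// Endpoints.php: swap the old graphql URL out of the constant.
const COMMENTS_BEFORE_COMMENT_ID_BY_CODE = 'https://www.instagram.com/p/{{shortcode}}/?__a=1';

// Instagram.php, inside getMediaCommentsByCode(): the ?__a=1 response wraps
// the payload in 'graphql' rather than 'data', so reads change accordingly.
$url = str_replace('{{shortcode}}', 'BfwTKhUl9v7', COMMENTS_BEFORE_COMMENT_ID_BY_CODE);
$jsonResponse = json_decode(file_get_contents($url), true);
$comments = $jsonResponse['graphql']['shortcode_media']['edge_media_to_comment']['edges'];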


RainGrid commented Apr 7, 2018

@qunn, you can only take a limited number of comments from the first query. has_next_page == true means the post has more comments than can be fetched with that first query.
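For posts where has_next_page is true, pagination still has to go through the graphql endpoint, feeding each page's end_cursor back in as "after". A rough sketch combining that with th4deu's query_hash URL; the shortcode is illustrative and the data.shortcode_media field names are assumptions based on the 2018-era response.

<?php
// Page through comments until has_next_page is false, passing the previous
// page's end_cursor as "after". query_hash value taken from th4deu's comment.
$code  = 'BfwTKhUl9v7'; // illustrative shortcode
$after = '';            // empty cursor for the first page
$all   = [];
do {
    $variables = rawurlencode(json_encode([
        'shortcode' => $code,
        'first'     => 50,
        'after'     => $after,
    ]));
    $url  = 'https://www.instagram.com/graphql/query/?query_hash=33ba35852cb50da46f5b5e889df7d159&variables=' . $variables;
    $json = json_decode(file_get_contents($url), true);

    $edge  = $json['data']['shortcode_media']['edge_media_to_comment'];
    $all   = array_merge($all, $edge['edges']);
    $after = $edge['page_info']['end_cursor'];
} while ($edge['page_info']['has_next_page']);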


qunn commented Apr 12, 2018

@RainGrid, yes, you're right, and I'm still trying to find a solution... :) But for now this is what my project needs; I haven't stopped looking for a way to make more queries, but this works without the 429 limit. :)

@adiv-phpian

There's a way to avoid the rate limit; I was able to retrieve 100K likes without any problem. All we need to do is send some prior requests to Instagram to avoid getting the rate limit error.

raiym closed this as completed Nov 26, 2018

imkimchi commented Apr 1, 2019

@muthu-kc what exactly do you mean by sending prior requests? Do you just send a request to see if the response is 429 and then wait a few minutes before sending again?


adiv-phpian commented Apr 2, 2019

@imkimchi You need to use mobile API cookies with the web API. That way you can avoid rate limits.
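adiv-phpian doesn't spell out how those cookies are obtained. Purely as an illustration, attaching an already-captured mobile session cookie to a web request with cURL might look like this; the sessionid value is a placeholder.

<?php
// Attach a sessionid cookie (captured from a logged-in mobile API session)
// to an ordinary web endpoint request.
$ch = curl_init('https://www.instagram.com/p/BfwTKhUl9v7/?__a=1');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIE, 'sessionid=PLACEHOLDER_SESSION_ID;');
$body = curl_exec($ch);
curl_close($ch);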


imkimchi commented Apr 2, 2019

@muthu-kc that's interesting! How can I get mobile API cookies on a PC, though?


tawfek commented Apr 23, 2019

> @imkimchi You need to use mobile API cookies with the web API. That way you can avoid rate limits.

How can I get mobile API cookies and use them with a web API endpoint?
Do you mean something like this, or what?

$instagram->setUserAgent("Mozilla/5.0 (Mobile; Windows Phone 8.1; Android 4.0; ARM; Trident/7.0; Touch; rv:11.0; IEMobile/11.0; NOKIA; 909) like iPhone OS 7_0_3 Mac OS X AppleWebKit/537 (KHTML, like Gecko) Mobile Safari/537");
