Empty URLs When making Async Request #272

Open
seanpdwyer7 opened this issue Dec 12, 2020 · 6 comments


seanpdwyer7 commented Dec 12, 2020

When running an async analytics request with the following script:

import time

from twitter_ads.creative import PromotedTweet

# Queue one async stats job per chunk of 20 entity IDs
# (split_list is a local helper that splits ids into chunks of the given size).
queued_job_ids = []
try:
    for chunk_ids in split_list(ids, 20):
        job = PromotedTweet.queue_async_stats_job(
            account, chunk_ids, metric_groups,
            placement=placement, granularity=granularity,
            start_time=start_time, end_time=end_time)
        queued_job_ids.append(job.id)
        print(chunk_ids)
except Exception:
    pass

print(queued_job_ids)

# Give the jobs some time to finish, then fetch their results.
time.sleep(30)
try:
    async_stats_job_results = PromotedTweet.async_stats_job_result(
        account, job_ids=queued_job_ids)
except Exception:
    pass

# Download the data file for each job result.
async_data = []
count = 0
for result in async_stats_job_results:
    # time.sleep(15)
    count += 1
    print(count)
    url = result.url
    print(async_stats_job_results)
    print(url)
    print(result)
    try:
        async_data.append(PromotedTweet.async_stats_job_data(account, url=url))
    except Exception:
        pass

I'm getting the following output (the url comes back as None):

<twitter_ads.cursor.Cursor object at xxxxxx>
None
<Analytics resource at xxxxxx id=xxxxx>

I've seen this happen before, and I assume it's latency on Twitter's side delaying the report beyond the time I'm waiting for it.

I'm wondering if anyone else has seen Twitter return empty URLs for these requests, and what you've done to ensure that each job actually returns a data file.
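
A minimal guard like the following (reusing the variables from the script above) at least avoids passing url=None into async_stats_job_data, but it still silently drops the chunks whose reports aren't ready:

from twitter_ads.creative import PromotedTweet

# Only download data for jobs that actually produced a URL; jobs that are
# still processing (or that failed) come back with url=None.
async_data = []
skipped = []
for result in PromotedTweet.async_stats_job_result(account, job_ids=queued_job_ids):
    if not result.url:
        skipped.append(result.id)
        continue
    async_data.append(PromotedTweet.async_stats_job_data(account, url=result.url))

print('downloaded {} chunks, skipped {} jobs with no url yet'.format(len(async_data), len(skipped)))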


arammaliachi commented Dec 14, 2020

I've also been getting an error since December 12th in PromotedTweet.async_stats_job_data, as follows:

Traceback (most recent call last):
  File "./codebuild-projects/ingest-twitter/ingest_twitter_metrics.py", line 176, in <module>
    data = PromotedTweet.async_stats_job_data(account, url=result.url)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/twitter_ads/analytics.py", line 115, in async_stats_job_data
    response = Request(account.client, 'get', resource.path, domain=domain,
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/twitter_ads/http.py", line 68, in perform
    response = self.__oauth_request()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/twitter_ads/http.py", line 105, in __oauth_request
    url = self.__domain() + self._resource
TypeError: can only concatenate str (not "bytes") to str

It seems like the URL is not being composed correctly anymore? Specifically, my script hits the error in this loop:

asyncResults = PromotedTweet.async_stats_job_result(account, job_ids=queueIds)
for result in asyncResults:
    # Error in line below
    data = PromotedTweet.async_stats_job_data(account, url=result.url)
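
Not sure of the root cause, but given the message ("can only concatenate str (not "bytes") to str"), one defensive workaround is to normalize the URL to a str before handing it to async_stats_job_data. This is only a sketch, assuming result.url is the value coming back as bytes:

for result in asyncResults:
    url = result.url
    if isinstance(url, bytes):
        # normalize a bytes URL to str before the SDK builds the request
        url = url.decode('utf-8')
    if not url:
        continue  # job not finished (or FAILED), so there is no data file yet
    data = PromotedTweet.async_stats_job_data(account, url=url)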

@DanCardin

My experience is that Twitter will absolutely return the url as null in the JSON, which translates to result.url being None.

For...reasons, we do our own polling loop, which checks that the status of every job in the chunk is 'SUCCESS' before proceeding. See the polling, polling2, retrying, etc. libraries though.

Keep in mind that jobs can also move to the "FAILED" status, which is how I landed here (our code was only checking that the job was no longer PROCESSING, but FAILED jobs also have a null url).
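
Roughly, the idea looks like this. This is a sketch rather than our actual code; it assumes the same PromotedTweet / account / queued_job_ids names from the script above and the result objects' status attribute:

import time

from twitter_ads.creative import PromotedTweet

def wait_for_async_jobs(account, job_ids, interval=15, max_wait=600):
    """Poll until every job is SUCCESS, raising if any job FAILED or the wait times out."""
    deadline = time.monotonic() + max_wait
    while time.monotonic() < deadline:
        results = list(PromotedTweet.async_stats_job_result(account, job_ids=job_ids))
        statuses = [r.status for r in results]
        if all(s == 'SUCCESS' for s in statuses):
            return results
        if any(s == 'FAILED' for s in statuses):
            # FAILED jobs also come back with url=None, so surface them
            # instead of silently downloading nothing.
            raise RuntimeError('async stats job(s) failed: {}'.format(statuses))
        time.sleep(interval)
    raise TimeoutError('async stats jobs still not ready after {}s'.format(max_wait))

results = wait_for_async_jobs(account, queued_job_ids)
data = [PromotedTweet.async_stats_job_data(account, url=r.url) for r in results]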

@seanpdwyer7
Author

@DanCardin with a polling loop, how long does it typically take for your jobs to finish?

@tushdante
Collaborator

One thing to point out is that we do have built-in retry logic with several options. Additionally, I'm happy to review/accept any PRs to improve the SDK.
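
For reference, those options are passed when constructing the Client. A minimal sketch, with your credentials in place of the placeholders; the exact option names and sensible values are in the SDK README, the ones below are illustrative:

from twitter_ads.client import Client

# Retry/rate-limit behavior is configured on the client itself.
client = Client(
    CONSUMER_KEY,
    CONSUMER_SECRET,
    ACCESS_TOKEN,
    ACCESS_TOKEN_SECRET,
    options={
        'handle_rate_limit': True,      # back off automatically when rate limited
        'retry_max': 3,                 # how many times to retry a request
        'retry_delay': 5000,            # delay between retries, in milliseconds
        'retry_on_status': [500, 503]   # response codes that trigger a retry
    })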

@seanpdwyer7
Author

@tushdante would you be able to share some examples of how to avoid these issues in the examples directory? I've used a while loop to check and wait for the status, but I'm nervous about deploying it on the server we're using because I've seen the while loop get stuck waiting for the JSON to be ready.

Wondering if you have any best practices for avoiding empty files?

@DanCardin

@seanpdwyer7 It varies pretty wildly. I think we have a hard cap at 10 minutes and have seen it time out a number of times (more so recently). But sometimes it's much, much shorter.

@tushdante I'm not certain that's relevant here (though feel free to correct me if it is). The two API methods called above don't poll or block on the completion of the report, so the retry logic would just keep one from going over the API call limit? Unless maybe the _result method can 4xx, and the retry_delay and retry_status fields are relevant?
