
Empty Response and Response Code #477

Open
TigerWolf opened this issue Sep 22, 2015 · 7 comments

@TigerWolf

I have some code that makes a request; the first request succeeds, and then all subsequent requests return an empty result.

For example:
Typhoeus.get("http://google.com")

I, [2015-09-22T12:56:42.897665 #27282]  INFO -- : Started GET "/v1/library/account" for 0.0.0.0 at 2015-09-22 12:56:42 +0930
D, [2015-09-22T12:56:43.176139 #27282] DEBUG -- : ETHON: Libcurl initialized
D, [2015-09-22T12:56:44.308576 #27282] DEBUG -- : ETHON: performed EASY effective_url=https://theurlwashere.com response_code=200 return_code=ok total_time=1.1307260000000001
I, [2015-09-22T12:56:58.387636 #27282]  INFO -- : Started GET "/v1/library/account" for 0.0.0.0 at 2015-09-22 12:56:58 +0930
D, [2015-09-22T12:56:58.883898 #27282] DEBUG -- : ETHON: performed EASY effective_url=https://theurlwashere.com response_code=0 return_code=ssl_cacert total_time=0.0

I thought it might have been caching, so I added a UUID to the URL as a parameter, but this did not fix the problem.

This is on a server (UAT), so after every deploy it works once. My local environment does not have this problem.

@AvnerCohen
Contributor

Have you tried updating the local libcurl?
Typhoeus in this sense is a wrapper on top of libcurl, so if a simple GET behaves like that, I'd assume it's something in the environment.

Are you able to pinpoint the issue, maybe to HTTPS sites? I'm asking because of the ssl_cacert I see being sent back in the second call.

@TigerWolf
Author

Here is some information that might help:

$ curl --version
curl 7.15.5 (x86_64-redhat-linux-gnu) libcurl/7.15.5 OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
Protocols: tftp ftp telnet dict ldap http file https ftps
Features: GSS-Negotiate IDN IPv6 Largefile NTLM SSL libz
$ cat /etc/*-release
Red Hat Enterprise Linux Server release 5.11 (Tikanga)
$ curl-config --version
libcurl 7.15.5

@AvnerCohen
Contributor

7.15.5 was released on August 7, 2006. I would suggest trying to update to a newer version; a lot has happened since then.

Not that it's a sure fix, but since I don't remember seeing this behaviour before, and don't remember anyone using a version that old, the correlation does seem likely.

@nicolasgarnil

This is probably because you have a cache strategy configured, so Typhoeus is caching the first request and returning the cached result for subsequent requests.

I'm having the same problem, but my request is a POST, which should not be cached by Typhoeus.

Any thoughts?

@TigerWolf
Author

I don't have a cache strategy configured. I also added a unique ID to the URL to bypass any caching as a test, and it had no effect.

@ryana
Contributor

ryana commented Nov 6, 2015

return_code=ssl_cacert means the server's SSL certificate chain is probably missing intermediate certs. If theurlwashere is the same in both cases, does the domain resolve to multiple IPs? I bet one machine is set up correctly and one is not.
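To catch this sooner, you can inspect response.return_code instead of just the body (it's :ok on success and :ssl_cacert when peer verification fails). A minimal sketch — curl_return_hint is a hypothetical helper, not part of Typhoeus:

```ruby
# Hypothetical helper: turn the libcurl return code symbol that
# Typhoeus exposes via response.return_code into a human-readable hint.
def curl_return_hint(return_code)
  case return_code
  when :ok
    "request completed at the libcurl level"
  when :ssl_cacert
    "peer certificate could not be verified -- often a missing " \
    "intermediate certificate on the server, or a stale CA bundle"
  else
    "libcurl error: #{return_code}"
  end
end

# In an app using Typhoeus this might be wired up as (assumption,
# not code from this thread):
#
#   response = Typhoeus.get("https://theurlwashere.com")
#   warn curl_return_hint(response.return_code) unless response.success?

puts curl_return_hint(:ssl_cacert)
```

With response_code=0 and an empty body, the return code is the only place the SSL failure actually shows up, which is why logging it makes this class of bug much easier to spot.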

@ryana
Contributor

ryana commented Nov 6, 2015

BTW, I just ran into this with a site I'm working on. Just bought a new SSL cert for ChartURL.com and forgot to add the intermediate certs when I added the site to my nginx proxy.
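For anyone hitting the same nginx mistake: nginx has no separate "chain" directive, so the file given to ssl_certificate must contain the leaf certificate followed by the intermediates. A minimal sketch, using placeholder file names (substitute whatever your CA issued):

```shell
# Placeholder contents so the sketch is self-contained; in practice
# these are the real PEM files from your certificate authority.
printf 'LEAF CERTIFICATE\n' > charturl.crt
printf 'INTERMEDIATE CERTIFICATE\n' > intermediate.crt

# Order matters: leaf certificate first, then the intermediate chain.
cat charturl.crt intermediate.crt > fullchain.crt

# Then point nginx at the combined file, e.g.:
#   ssl_certificate     /etc/nginx/ssl/fullchain.crt;
#   ssl_certificate_key /etc/nginx/ssl/charturl.key;
```

Browsers often mask a missing intermediate by fetching it themselves, which is why the site can look fine in a browser while libcurl returns ssl_cacert.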
