dpl-1.9 fails to authenticate on bintray #9314
Comments
To revert to the old behavior, please try:

```yaml
deploy:
  edge:
    branch: v1.8.47
  # rest of the deploy config
```
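For completeness, here is a sketch of how that pin might sit inside a full Bintray deploy section; everything other than the `edge`/`branch` lines is an illustrative placeholder, not taken from anyone's actual config:

```yaml
deploy:
  provider: bintray
  edge:                  # use dpl from this git branch instead of the released gem
    branch: v1.8.47
  file: descriptor.json  # hypothetical Bintray descriptor file
  user: your-bintray-user
  key:
    secure: "..."        # your encrypted Bintray API key
  skip_cleanup: true     # if the deployment relies on build artifacts
```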
Thanks @BanzaiMan. I saw this advice in other issues of the same kind, but didn't try it because the syntax is rejected by travis lint (both the local Ruby client and the current web lint). But I will try.
For reference, the syntax was a bit different in my case, since I have multiple deployments:

```yaml
deploy:
  - edge:
      branch: v1.8.47
    provider: xxx
    # rest
  - edge:
      branch: v1.8.47
    provider: zzz
    # rest
```

I'll let you close this when it seems right (i.e. now, or whenever you expect this workaround will no longer be needed).
While troubleshooting, I found that the … To test:

```yaml
deploy:
  provider: bintray
  edge:
    branch: bintray-unauth
  skip_cleanup: true # if your deployment relies on build artifacts
  # rest
```

(Adjust for multi-provider deployment accordingly.) Thanks.
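A sketch of the multi-provider adjustment mentioned above (the structure mirrors the earlier example; the `# rest` comments stand for whatever provider settings each deployment already has):

```yaml
deploy:
  - provider: bintray
    edge:
      branch: bintray-unauth
    skip_cleanup: true
    # rest of the first deployment
  - provider: bintray
    edge:
      branch: bintray-unauth
    skip_cleanup: true
    # rest of the second deployment
```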
I tried, and it fails. I did it a bit differently, since I triggered the build using the API rather than committing to GitHub.
Not really the right spot, but it would be nice to display the
Thanks for the report. I pushed another commit. Could you test it again?
Travis-ci broke bintray plugin: travis-ci/travis-ci#9314
It would also be nice to see this problem fixed: https://github.com/travis-ci/travis-ci/issues/5966
@igagis It's best to keep this issue focused on the problem at hand. Thanks.
Just did: it failed (and branch v1.8.47 is still working).
@fg2it Sorry for the continued troubles. I've pushed another commit. Could you restart the failing build? Thank you!
The tag v1.8.47 would not change, so there is no need to report the result. |
Hmm, mixed feelings here. My .yml has 2 deployment steps: the second one hangs at "Building dpl gem locally with source travis-ci/dpl and branch bintray-unauth", which is a bit strange since the previous one succeeded. Edit: to check it was not some connection issue, I started another build, but it still fails.
Could you reproduce the custom payload you gave to this build?
Also, is the hang reproducible?
I've pushed another commit to the branch.
Yes, it was. Here is the deploy step:

```yaml
deploy:
  - edge:
      branch: bintray-unauth
    provider: bintray
    file: armv7.d
    user: fg2it
    key:
      secure: "FteIGr9J9B93Etvy5PmuAx5cZpB5sCt1Bn7FXf75zcAtWUMQEUHwqc995XrAjPrnFZ9yWMF1kInmRIg8ejbLx0PuBWBhJU60FPwRPm+2M2WM8Shc98DXPIm6A+xKC06yt8rU6/nyYyKrJBEf0kPiOHjnXE4rgqZaLVA+UbfSEuCVclmylxecZYNDvjIVJpDQghp6fCUhMqUjwO9UICZfl9ZhRXBeeZloUhR0qwF9orEtjW8usE6PDWl136tqAqBMQOe06Qz+n9JKPqAKIzdLOrviDKr8uDxA/ZPucuAHYCScq3IuRNwgdzgOKOvlVVDW5+uObEWahKEQHs8e7zC4fKdscq5CsZhuY0rVV8G6UNW3Hj69OL7gefiQ5Bd88QoF0X2/CFPe+5GngehpbMnIAg8iH55pEC5v516w4zaij4lyUlcY/o9foSre8YJaD5UlkfsGCj5HJiOnPLCK6l1rtxVTbol4GYq8+w5kWQhfHzfTp4GzUZuQS12K32kynDt3EV0To3VPpbTxK4+Bdw5BnEo+gYaV9/ynM6BiCqlnNUWi42JzUwpC1MFiesVuEaZ/IgtXeZM9/vnp1WgKZDwqDNfvvv7l+DaLI4h+kx7cgSi6EQraxOLBMu/i76CAh6SXunHIKyIhIo1FbQJ8n2HohTZ6bD3jIbnJz/KEqNkeCyA="
    skip_cleanup: true
  - edge:
      branch: bintray-unauth
    provider: bintray
    file: arm64.d
    user: fg2it
    key:
      secure: "FteIGr9J9B93Etvy5PmuAx5cZpB5sCt1Bn7FXf75zcAtWUMQEUHwqc995XrAjPrnFZ9yWMF1kInmRIg8ejbLx0PuBWBhJU60FPwRPm+2M2WM8Shc98DXPIm6A+xKC06yt8rU6/nyYyKrJBEf0kPiOHjnXE4rgqZaLVA+UbfSEuCVclmylxecZYNDvjIVJpDQghp6fCUhMqUjwO9UICZfl9ZhRXBeeZloUhR0qwF9orEtjW8usE6PDWl136tqAqBMQOe06Qz+n9JKPqAKIzdLOrviDKr8uDxA/ZPucuAHYCScq3IuRNwgdzgOKOvlVVDW5+uObEWahKEQHs8e7zC4fKdscq5CsZhuY0rVV8G6UNW3Hj69OL7gefiQ5Bd88QoF0X2/CFPe+5GngehpbMnIAg8iH55pEC5v516w4zaij4lyUlcY/o9foSre8YJaD5UlkfsGCj5HJiOnPLCK6l1rtxVTbol4GYq8+w5kWQhfHzfTp4GzUZuQS12K32kynDt3EV0To3VPpbTxK4+Bdw5BnEo+gYaV9/ynM6BiCqlnNUWi42JzUwpC1MFiesVuEaZ/IgtXeZM9/vnp1WgKZDwqDNfvvv7l+DaLI4h+kx7cgSi6EQraxOLBMu/i76CAh6SXunHIKyIhIo1FbQJ8n2HohTZ6bD3jIbnJz/KEqNkeCyA="
    skip_cleanup: true
```

I started a new build.
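For context on why a wrong `user`/`key` pair surfaces as an authentication failure: as far as I understand, dpl's Bintray provider ends up issuing HTTP requests to the Bintray REST API authenticated with Basic auth built from exactly those two values. A minimal Python sketch of such a request (the endpoint path and the placeholder key are made up for illustration, not taken from dpl's source):

```python
import base64
import urllib.request

def bintray_request(user: str, api_key: str, url: str) -> urllib.request.Request:
    """Build a Bintray API request carrying HTTP Basic auth.

    A 401 response to requests like this is what shows up in the build
    log as the "fails to authenticate" error discussed in this issue.
    """
    token = base64.b64encode(f"{user}:{api_key}".encode("ascii")).decode("ascii")
    req = urllib.request.Request(url, method="GET")
    req.add_header("Authorization", "Basic " + token)
    return req

# Hypothetical endpoint; "fg2it" is the user from the config above,
# and the key is a placeholder.
req = bintray_request("fg2it", "API_KEY_PLACEHOLDER",
                      "https://api.bintray.com/repos/fg2it/deb")
```

If the key is wrong or mangled (for instance via a bad `secure:` value), Bintray rejects requests like this regardless of which dpl version built them.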
Hi guys, looks like I have met the same problem. Switching to the dpl branch v1.8.47 did not help me.
and it fails the same way |
* Changing the Bintray User

  For testing reasons, using `esaude-ops` instead of `psbrandt` to fix authentication errors during the build.

* Fix dpl-1.9 failure to authenticate on bintray

  *Dpl* is the deploy tool used for continuous deployment. As per the GitHub issue travis-ci/travis-ci#9314, there are some problems with the latest version, 1.9; the fix for now is to use an older version, v1.8.47. A permanent solution is being worked out, as seen in the above link.
Also fix the strange issue with bintray: travis-ci/travis-ci#9314
Hi folks, I'm having the same issue here: https://api.travis-ci.org/v3/job/353266390/log.txt The uploads started failing somewhere between March 6th (last working upload) and March 12th (first failing upload).
But I still get the same problem:
@adejanovski Could you point to a working deployment? Thanks.
Hi @BanzaiMan, sure thing: https://travis-ci.org/thelastpickle/cassandra-reaper/jobs/349673544
The interesting thing is that although I indicated 1.8.47 in my deploy config, the logs of the last job I ran indicate it's using 1.8.48.
@adejanovski Is the API key correct in the failing build? https://github.com/thelastpickle/cassandra-reaper/blob/8c9de3c83acbc6666e5cb12fccd2deee5b3b3d92/.travis.yml#L65
I just switched back to an "insecure" API key stored in Travis private env variables, and it's working again with 1.8.47/48 with:
I think I had followed the steps to encrypt the key correctly, but hey, it's working now: https://api.travis-ci.org/v3/job/353337207/log.txt
@dismine Your issue appears to be an incorrect deploy key.
@BanzaiMan, I already tried switching to a new key. It works for AppVeyor, but Travis still complains about the key. Plus, my issue started before the switch, not after it.
@dismine What switch are you talking about?
@BanzaiMan I mean the change of the API key. That was my first guess: revoke the old key and create a new one. Again, the new key works for AppVeyor but not for Travis.
@dismine I've double-checked this value https://github.com/dismine/Valentina_git/blob/0a27af972d1b8988bcceeee6947028153f9bd7b6/.travis.yml#L98. It appears to start with
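Since a corrupted or mis-pasted `secure:` value keeps coming up in this thread, one quick local sanity check is to confirm the value is well-formed base64 that decodes to a plausibly sized encrypted blob. A hypothetical helper (the 128-byte floor is my own rough assumption, not anything Travis documents):

```python
import base64
import binascii

def plausible_secure_value(value: str) -> bool:
    """Return True if `value` looks like a Travis `secure:` entry:
    valid base64 that decodes to a reasonably large encrypted blob."""
    try:
        raw = base64.b64decode(value, validate=True)
    except binascii.Error:
        # Stray characters or bad padding: the value was likely mangled
        # when it was pasted into .travis.yml.
        return False
    # RSA-encrypted payloads are a few hundred bytes; 128 is a loose floor.
    return len(raw) >= 128
```

This only catches mangling, of course; a syntactically valid value can still encrypt the wrong key or be encrypted for the wrong repository.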
This issue is affecting my team as well. |
@fg2it travis-ci/travis-build#1330 should fix the hang on the second (and subsequent) deploys.
@BanzaiMan thank you. I just fixed the key, but as with others, it works for me only with branch 1.8.47.
@dismine Yes, we are aware of that; this is the issue to discuss it. The proposed fix is in the
@BanzaiMan awesome: this one works.
Sweet. I'll merge the PR and have a 1.9.2 release before the end of the week.
@BanzaiMan nice. Does it mean we will have to drop the

```yaml
edge:
  branch: bintray-unauth
```
@fg2it Eventually. We'll keep the branch alive for a week or so after the 1.9.2 release, so that you can get off it.
Temporarily using older bintray-deployer version to work around the travis-bintray-deployer bug: travis-ci/travis-ci#9314
1.9.2 is out.
@BanzaiMan by the way, tested and working.
since travis-ci/travis-ci#9314 is fixed
It seems dpl v1.9 still has problems deploying to Bintray in our setup. This issue is somewhat related to travis-ci/travis-ci#9314
I didn't change the authentication key (and confirmed on Bintray that it is still valid), so I expect I should not have this deployment problem.
The failing Travis log is here (the previous working one, with dpl 1.8.47, is here).