Upgrade to latest Travis CI infrastructure #11986

Closed
mattab opened this Issue Aug 30, 2017 · 7 comments

3 participants
@mattab
Member

mattab commented Aug 30, 2017

The goal of this issue is to move our builds to the latest Travis CI infrastructure documented here.

Current status

Recently our builds were running on the latest Travis CI infrastructure and were failing with the message below. For now we have reverted to the older infrastructure in 485f61e and 6846058.

First error on new distribution

Error downloading object: tests/UI/expected-screenshots/Login_password_reset.png (8c6d242): Smudge error: Error downloading tests/UI/expected-screenshots/Login_password_reset.png (8c6d242ab1e0cd44333e359adf6f3b6c49123a41b0cf3712e7afa2ea6121bca6): batch response: Rate limit exceeded: https://github.com/piwik/piwik.git/info/lfs/objects/batch

from here: https://travis-ci.org/piwik/piwik/jobs/269029296

Timeline

As Travis explains, it seems we will need to upgrade within the next few weeks:

By the end of September 2017, we hope to have 100% of incoming configurations with dist: precise routed to the sudo-enabled infrastructure, and to retire the container-based infrastructure. We have no immediate plans to discontinue support for jobs that specify sudo: required and dist: precise, although we do not plan to update the images for these jobs any more.
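Per the announcement above, staying on the old Precise images requires opting in explicitly. A sketch of the relevant `.travis.yml` keys, assuming the keys named in the Travis announcement (the revert commits themselves are not shown here):

```yaml
# Sketch: keys the Travis announcement says keep jobs on the old
# (no-longer-updated) Precise images
sudo: required
dist: precise
```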

@mattab mattab added this to the 3.2.0 milestone Aug 30, 2017

@mattab
Member

mattab commented Aug 30, 2017

Contacted GitHub support about the error and they replied:

We have API Rate limits set up for the download of LFS files. I would guess that the clone request is being made as an Unauthenticated request, rather than an Authenticated one. You should be able to change this to use Authentication which would increase those limits. The post below has some guidance on how to set this up:
git-lfs/git-lfs#2133

So maybe we try the solution in the git-lfs issue?
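The usual approach from that git-lfs issue is to make the LFS batch requests authenticated, which raises the rate limits. A sketch of how that could look in `.travis.yml`, assuming `GITHUB_USER` and `GITHUB_TOKEN` are defined as encrypted Travis variables (both names are assumptions, not from this thread):

```yaml
before_install:
  # Write credentials to ~/.netrc so git-lfs downloads are authenticated
  # instead of hitting the unauthenticated rate limit
  - printf "machine github.com\n  login %s\n  password %s\n" "$GITHUB_USER" "$GITHUB_TOKEN" > ~/.netrc
  - chmod 600 ~/.netrc
```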

@sgiehl
Member

sgiehl commented Aug 30, 2017

I think this will be a "bigger" project, as the new infrastructure requires many more changes.
Skipping the LFS smudge at the beginning wouldn't be a problem, but the build then fails later when trying to configure git with a username and email, and if we skip that as well, the webserver setup fails...

Actually, I believe it would be most useful to build the new Travis setup from scratch, and maybe (if possible) with sudo: false.

@mneudert would you maybe be keen to do that?
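One way to sidestep the smudge failure described above is to clone with the LFS smudge filter disabled and fetch the objects explicitly once credentials are in place. A sketch only, assuming authentication is configured in an earlier step:

```yaml
env:
  global:
    - GIT_LFS_SKIP_SMUDGE=1   # clone without downloading LFS objects
install:
  # Fetch the expected-screenshot files in one authenticated batch
  - git lfs pull
```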

@mneudert
Member

mneudert commented Aug 30, 2017

Switching over to a partially sudoless infrastructure sounds like something I would really like to try.

Partially only because I think the ramdisk used in some tests only works that way. But AFAIK it is possible to activate sudo on a per-job basis without the global requirement.

But that might still not solve the problem with the git credentials. Wouldn't it work using some travis secure token to at least get everything up and running until a complete switch can be made?
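Per-job sudo as described above can be expressed through the build matrix; here is a sketch (the suite names and job layout are assumptions for illustration):

```yaml
sudo: false            # default: container-based, sudo-less jobs
matrix:
  include:
    - env: TEST_SUITE=UnitTests
    - env: TEST_SUITE=UITests
      sudo: required   # only this job gets a sudo-enabled VM (e.g. for the ramdisk)
```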

@sgiehl
Member

sgiehl commented Aug 30, 2017

Yes, a secure token should work as well. For now the builds are running on the old precise infrastructure, and that should keep working until we switch to the new one.
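A secure token as discussed here would typically be generated with the `travis` CLI and end up as an encrypted global variable. A sketch of the result (`GITHUB_TOKEN` and the placeholder value are assumptions):

```yaml
env:
  global:
    # generated via: travis encrypt GITHUB_TOKEN=<token> --add env.global
    - secure: "<encrypted-value>"
```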

@mattab
Member

mattab commented Nov 19, 2017

Hi @mneudert

Do you have an update on the progress of migrating Core and plugins to the latest Travis infrastructure? Thanks!

@mneudert
Member

mneudert commented Nov 22, 2017

I have to admit I sort of forgot this issue :D

Working on it right now with some pull requests incoming soon!

@mneudert
Member

mneudert commented Jan 5, 2018

With the merge of #12405 this should finally be completed: there is no longer any build running on the old precise infrastructure 🎉
