Installing / updating dependencies seems very slow #1805
On my laptop I get 18 seconds on the first run and then 10 seconds with a warm cache (the JSON files from Packagist are cached). So if it's not the network (i.e. if you run it twice in a row and it's still as slow) then I guess it's CPU bound. Not sure what to do about that (of course there might be improvements we can make, but nothing major I'm afraid). The thing is though, you should run update on dev machines, which hopefully have a decent CPU and RAM, then commit your composer.lock file and run install from that on prod servers. Installing from the lock file is almost instant no matter the machine.
Thanks for your response. I ran it off the VM (on my laptop directly) and got 18 seconds on the first run and 6 seconds after that, so I'd agree it's CPU. I also ran the same on my Linode VM and got 25 seconds, then 18 seconds. This is clearly related to my VM's performance, so it's not a Composer issue. Out of interest, what is Composer actually doing in that initial "Installing dependencies" bit? It seems very CPU intensive. Is it multithreaded or single threaded?
It's checking all the requirements of your package and your dependencies to try and find a solution to install. Since it does this in a fairly advanced way, unlike some other package managers that took simpler approaches, it can take quite a bit of computing power. Unfortunately PHP can't do multi-threading, and to be honest I don't know if the problem at hand would be easy to parallelize even if we could.
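To give a feel for why this step is CPU-bound rather than network-bound, here is a toy backtracking resolver in Python. The repository, package names, and constraints are all made up for illustration; this is nothing like Composer's actual solver in scale or sophistication, just the shape of the search it performs:

```python
# Toy repository: package -> {version: {dependency: set of allowed versions}}.
# All names and constraints here are hypothetical.
REPO = {
    "app":  {"1.0": {"lib": {"1.0", "2.0"}, "util": {"2.0"}}},
    "lib":  {"1.0": {"util": {"1.0"}}, "2.0": {"util": {"2.0"}}},
    "util": {"1.0": {}, "2.0": {}},
}

def resolve(requirements, chosen=None):
    """Backtracking search for one set of versions satisfying every constraint.

    `requirements` is a list of (package, allowed_versions) pairs. In the
    worst case this explores every combination of versions, which is why
    real dependency resolvers can be so CPU-intensive.
    """
    chosen = dict(chosen or {})
    if not requirements:
        return chosen
    (name, allowed), rest = requirements[0], requirements[1:]
    if name in chosen:  # already picked: it must satisfy this constraint too
        return resolve(rest, chosen) if chosen[name] in allowed else None
    for version, deps in REPO[name].items():
        if version not in allowed:
            continue
        chosen[name] = version
        result = resolve(rest + list(deps.items()), chosen)
        if result is not None:
            return result
        del chosen[name]  # dead end: backtrack and try the next version
    return None

print(resolve([("app", {"1.0"})]))
# {'app': '1.0', 'lib': '2.0', 'util': '2.0'}
```

Note that lib 1.0 gets tried and rejected first (it forces util 1.0, conflicting with app's requirement of util 2.0); real solvers do this kind of trial and backtracking across thousands of version combinations.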
Thanks for your time @Seldaek - much appreciated :)
Is there any way to disable these checks? I have the same problem on Ubuntu 12.04 x86-64 in a KVM virtual machine (Quad-Core Xeon 2.4 GHz, 4 GB).
@dsbaars resolving the dependencies is the main feature of Composer, so no, we cannot disable it while keeping Composer functional.
I read the thread and understand that resolving dependencies is a fundamentally time-consuming process, and I get that. I have a somewhat related question though. In my local environment, with dependencies that are pretty simple, I'm seeing an install time of 1.29s vs. 128.91s in my staging environment. The staging environment's hardware is faster than my local machine, and in both cases I've already cloned down the latest repositories, so it should just be performing checkouts.
And here's staging:
My question is: are there any Composer settings that would significantly affect the speed of the install in one environment versus another? It almost seems like my local is working off of a "warmed cache", so to speak, and staging is doing a whole lot of checks over the network.
Okay, I figured out how to get more verbose output. I guess it would be nice though if Composer could avoid a remote fetch if/when the dependencies are already satisfied by the local repository.
I guess we could make it check if the commit exists before fetching, but …
Right, that's a good point! I ended up implementing something in my deploy script that basically creates a copy of the entire codebase and runs the install there. Maybe an option in Composer itself to do that kind of thing would be good.
@kalenjordan I don't think this is something Composer should do
The easiest way to make a deployment atomic is probably to deploy into a sibling folder of the previous deployment, and have the webserver document root be a symlink to the live version. Then the whole deployment is independent, and the atomic operation is changing the symlink. This is what Capifony does, for instance.
I see what you mean, and yes, that's the deploy strategy I ended up going with. I'm still pretty new to Composer, so I'm probably not in the best position to suggest what's in scope for the project. But I think of it similarly to Git: Git isn't a deploy tool either, and other things have to happen in conjunction with a git checkout to complete a proper deploy. But a git checkout is still safe to perform in a production environment. If the repository in question is down, or if the fetch takes several seconds or minutes, the file system won't ever be left in an unstable intermediate state (well, maybe for a split second, but for all intents and purposes it's safe to do and widely used). It would be kind of cool if the same were true of Composer. Maybe that would be as simple as performing all of the fetches up front and then performing all of the checkouts in quick succession, so that Composer would be at least as production-friendly as git itself.
@kalenjordan the difference is that … And this is not as simple as that anyway: in the case of packages installed from an archive, Composer has to delete the existing folder and then install the new zipball. So we cannot make it atomic at the Composer level.
Okay cool, thanks for the explanation. Hopefully at some point soon I'll dig further into Composer and better understand the architecture.
Similar stuff here: setting up Symfony2 via the official Symfony2 composer command takes forever in an Ubuntu 12.04 Vagrant machine (no shared folder / NFS issue, btw). Weird. Can be reproduced. Btw, there's a StackOverflow question about this with more than 3000 hits: http://stackoverflow.com/questions/13413788/composer-is-very-very-slow UPDATE: Interesting stuff from the comments:
By the way, my sysadmin ended up finding that there was some kind of block at the network level related to this. Might have been a firewall rule or something; not totally sure.
I added "require-dev": { to my composer.json, ran composer update, and it threw this exception: Loading composer repositories with package information
Please help me install it. Thanks
@mrstormcs try installing git first (e.g. …)
@asgrim Thank you for your help. |
My solution (Windows 10 x64, WAMP, Laravel): you need a file called cacert.pem.
Paste that file into your main WAMP directory. Then search for php.ini and open all the files with that name in Sublime or any text editor. Find the line where you put your path to this file - see the example from my case:
and
Do this in all the php.ini files you found before - all of them! Then restart the server. That worked for me; hope it does for you.
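For reference, the php.ini directives usually pointed at a CA bundle for this fix are `curl.cainfo` and `openssl.cafile`; the exact path below is just an example and depends on where you saved the file:

```ini
curl.cainfo = "C:\wamp64\cacert.pem"
openssl.cafile = "C:\wamp64\cacert.pem"
```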
Hi there. I seem to have some issue with the Installing or Updating dependencies steps taking forever. To isolate my issue somewhat, I created a new directory, used the latest Composer at the moment (19bfd6c7), and put in this composer.json, nothing else. First output:
Then:
And finally
Is this expected behaviour, for it to take roughly 60 seconds to determine whether there are updates, even when there are no actual updates? I notice in htop that php has very high CPU usage during this time (90-100%). I don't suspect a problem with network connectivity or slowness:
Downloading this JSON from Packagist took 0.3s. I just did a speedtest.net check: we have 15ms ping, 73 Mbps down and 32 Mbps up.
The environment I'm running in is an Ubuntu 12.04 server, running in a VM on ESX servers, with 1 CPU and 2 GB RAM.