Don't require IDF_TOOLS_PATH location to be writable when idf.py is executed (IDFGH-7791) #9329
Comments
I made IDF_TOOLS_PATH writable to test whether things work once this issue is solved. Sadly, even after fixing this it still causes trouble: it requires a Python virtual environment to be created even when it's not needed because the deps are already installed system-wide. It fails with: I don't know if I should create a new issue related to this.
I think this part of the behavior is expected. Since IDF depends on quite a few Python packages (and on specific versions of them, as well), installing them into the global system environment is not realistic: it could cause compatibility issues with other software in the system. Even if there are no conflicts right now, they might occur in the future. We would like to avoid having bug reports opened against ESP-IDF which could be avoided by having an isolated Python environment. Could you please explain why you would like to avoid having a virtual environment (even one inside IDF_TOOLS_PATH)? As far as I know there is no significant overhead in that, compared to a global installation.
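As a rough sketch of the kind of isolation being described here (the `python_env` directory name and the helper function are hypothetical illustrations, not IDF's actual implementation), creating a per-tools-directory virtual environment only needs the standard library:

```python
import os
import subprocess
import sys


def ensure_venv(tools_path: str) -> str:
    """Create an isolated Python environment under the tools directory
    if one does not exist yet, and return its path.

    Note: the "python_env" subdirectory name is a hypothetical example.
    """
    env_dir = os.path.join(tools_path, "python_env")
    if not os.path.isdir(env_dir):
        # venv is part of the standard library, so this pulls in no
        # extra dependencies and leaves the system site-packages alone.
        subprocess.run([sys.executable, "-m", "venv", env_dir], check=True)
    return env_dir
```

Packages installed into such an environment are pinned independently of the distro's Python, which is the compatibility argument being made above.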
The tools that I used to develop our software with the 4.x versions required the tools to be installed system-wide. This allowed different IDEs to be well integrated with IDF by default. With version 5.0 the previous behavior is broken, and so are all the tools and the workflow in general. Moreover, I think that in Linux the standard way is to leave the package manager in control of the dependencies of the software. The decision of whether or not to use a virtual environment should, in my opinion, be made by the user, and the software should not block the installation of a package in the standard way.

I understand the issue, and that the recommended approach may be to use a virtual environment. I'm totally happy with that, but I still want to preserve the ability to install IDF from the distribution package manager. I also may want to use the dependencies from the package manager (possibly with security patches or other patches) rather than the pip packages (in this case I understand that it would make sense for you not to support issues from setups that didn't use the virtual-environment approach).
I think you can probably add the virtual environment
Thanks for the explanation, I understand your argument. I am not sure I agree with it. (i.e. I don't consider
The problem with that is that the issue will still initially get reported as a bug, and we will need to spend time investigating it before we find the root cause. We often don't get complete information about the environment in the issue report, even for the things that are mentioned in the issue template.
I strongly believe that yes, distro package managers are the standard way. If language package managers are mixed with distro package managers there are all sorts of security issues, compatibility issues and so on. In short, I think distro package managers are the standard way for users to install software (e.g. to install Firefox I use {apt,yum,whatever}, not a language-specific package manager). Language package managers are useful for developers. For instance, if I were an IDF developer I would use a virtual environment for developing IDF, and that would then make it easier for maintainers to package my software for the different distributions; that is, indeed, how almost all Linux software works.

I think that to make it work it won't be enough to add the bin directory to PATH, as the other directories where the libraries live would also have to be on the relevant search paths. Even if it works, I may be exposed to vulnerabilities. If my distribution releases a security fix for python-socketio, for example, and I am using the copy inside the virtual environment, I may think that I'm safe when in reality I'm not, because an out-of-date version is running inside the virtual environment. That means I need to manually track the advisories for every package that is not tracked by my distribution's package manager. It's not a very nice solution from a user perspective.
Can you confirm that the above commit results in idf.py not trying to download I don't see why a constraints file should be downloaded and/or checked when doing For people building on a local machine these things are fine, but this is not desirable behavior on a CI machine for those of us not using a docker-like workflow. The toolchain should be installed once and stay the same forever, and be updated manually if needed, including the Python libraries. Maybe this is a matter of opinion, but I don't like the idea of pulling from external sources as part of the build procedure every time. Not that I am auditing these packages, but it seems like a security risk.
@someburner no, this commit doesn't address downloading and checking the constraints file. It only allows having a read-only tools directory. We are going to change the behavior of export.sh so that it doesn't fetch a new constraints file if the IDF commit hasn't changed and the install script has already been run successfully once for this commit.
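The caching rule described here could be sketched as follows (the per-commit file name and the function are hypothetical illustrations of the idea, not the actual export.sh logic):

```python
import os


def constraints_up_to_date(tools_path: str, idf_commit: str) -> bool:
    """Return True if a constraints file was already fetched for this
    IDF commit, so the export step can skip the network download.

    Note: the "constraints-<commit>.txt" naming is a hypothetical example.
    """
    cached = os.path.join(tools_path, f"constraints-{idf_commit}.txt")
    return os.path.isfile(cached)
```

With a check like this, a CI machine with a pre-installed toolchain would only hit the network again after an IDF update, which addresses the concern raised above.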
@igrr that sounds good to me, thanks for the info!
@igrr This bug still seems to be present in the v5.0 branch. Using the
Probably this should be backported from master to v5.0 branch. |
Environment
master branch, but probably applies to release/v4.x as well

Problem Description
As found in #9328 (comment), if the tools have been installed by one user to a system-wide location, "regular" user can't successfully build the project using those tools.
Expected Behavior
Build succeeds. If some features (like idf_env.json) aren't available because IDF_TOOLS_PATH is not writable, they are skipped.
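A minimal sketch of this "skip instead of fail" behavior (the function and the exact warning text are hypothetical, not the actual idf.py code) could look like:

```python
import json
import os


def update_idf_env_json(tools_path: str, payload: dict) -> bool:
    """Write optional bookkeeping (idf_env.json) only when the tools
    directory is writable; otherwise skip it instead of aborting the build.

    Note: this is an illustrative sketch, not IDF's implementation.
    """
    if not os.access(tools_path, os.W_OK):
        print("warning: IDF_TOOLS_PATH is read-only, skipping idf_env.json update")
        return False
    with open(os.path.join(tools_path, "idf_env.json"), "w") as f:
        json.dump(payload, f)
    return True
```

The key point is that a read-only IDF_TOOLS_PATH downgrades the feature to a warning rather than a hard build failure.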
Actual Behavior
Build fails because IDF_TOOLS_PATH is not writable.
Steps to reproduce
See the discussion in the linked issue.