
Setting ARTIFACTS_KEYRING_NONINTERACTIVE_MODE=true results in pip not being able to find my package #18

Closed
philMarius opened this issue Apr 21, 2020 · 19 comments


@philMarius

Uploading with twine does work, however. I have set a personal access token for myself and set TWINE_USERNAME, TWINE_PASSWORD and TWINE_REPOSITORY_URL. Packages can be uploaded successfully (the package is already uploaded, hence the 409 error):

$ twine upload dist/*
Uploading distributions to https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/upload
Uploading <name>-0.2.0.dev1-py3-none-any.whl
100%|████████████████████████████████████████████████████████████████████████| 59.6k/59.6k [00:00<00:00, 73.7kB/s]
NOTE: Try --verbose to see response content.
HTTPError: 409 Client Error: Conflict - The feed '<feed>' already contains file '<name>-0.2.0.dev1-py3-none-any.whl' in package '<name> 0.2.0.dev1'. (DevOps Activity ID: ***-***-***-***-***) for url: https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/upload

However, I want to install the same package using pip non-interactively and have set the environment variable ARTIFACTS_KEYRING_NONINTERACTIVE_MODE to true. This fails the pip installation as can be seen below:

$ ARTIFACTS_KEYRING_NONINTERACTIVE_MODE=true
$ export ARTIFACTS_KEYRING_NONINTERACTIVE_MODE
$ printenv ARTIFACTS_KEYRING_NONINTERACTIVE_MODE
true
$ pip install --extra-index-url https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/simple/ --no-cache-dir <name>==0.2.0.dev1
Looking in indexes: https://pypi.org/simple, https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/simple/
ERROR: Could not find a version that satisfies the requirement <name>==0.2.0.dev1 (from versions: none)
ERROR: No matching distribution found for <name>==0.2.0.dev1

Setting it to false outputs the prompt to open the browser and give permission:

$ ARTIFACTS_KEYRING_NONINTERACTIVE_MODE=false
$ export ARTIFACTS_KEYRING_NONINTERACTIVE_MODE
$ printenv ARTIFACTS_KEYRING_NONINTERACTIVE_MODE
false
$ pip install --extra-index-url https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/simple/ --no-cache-dir <name>==0.2.0.dev1
Looking in indexes: https://pypi.org/simple, https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/simple/
[Minimal] [CredentialProvider]DeviceFlow: https://pkgs.dev.azure.com/<org>/test/_packaging/<feed>/pypi/simple/
[Minimal] [CredentialProvider]ATTENTION: User interaction required. 

    **********************************************************************

    To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code *** to authenticate.

    **********************************************************************

Any help would be much appreciated!

Notes:

  • I would specifically like to use pip and possibly env vars to solve this issue
  • I am running Ubuntu 19.10
@philMarius
Author

Had another crack at this and managed to get an okay solution; if anyone has any tips on improving it, I'm all ears.

My error lay in using --extra-index-url instead of --index-url for pip. Using pip's basic authentication support, you can use a personal access token in place of a username and, by setting ARTIFACTS_KEYRING_NONINTERACTIVE_MODE to true in the environment, install packages non-interactively.

Example:

pip install -i https://<token>@pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/pypi/simple/ --no-cache-dir <package>
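One way to keep the token itself out of scripts and source control (a sketch, not something from this thread) is to read the PAT from an environment variable and build the index URL at install time. The variable name AZURE_ARTIFACTS_PAT and the helper below are hypothetical:

```python
import os
from urllib.parse import quote


def feed_index_url(org: str, project: str, feed: str) -> str:
    """Build an Azure Artifacts index URL with the PAT taken from the
    environment, so the token never appears in committed files."""
    pat = os.environ["AZURE_ARTIFACTS_PAT"]  # hypothetical variable name
    return (
        f"https://{quote(pat, safe='')}@pkgs.dev.azure.com/"
        f"{org}/{project}/_packaging/{feed}/pypi/simple/"
    )
```

pip can also pick the URL up directly from the environment via PIP_INDEX_URL or PIP_EXTRA_INDEX_URL, which avoids putting the token on the command line at all.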

@dataders

@philMarius I'm glad you were able to get something working. However, as soon as you include the token in your index URL, you aren't using the keyring package anymore...

I want to be able to take advantage of artifacts-keyring while using a requirements.txt with the index URL included. In that case it doesn't make sense to include the token, because then the key would be hard-coded.

Also, isn't it a bug in pip/keyring that --extra-index-url doesn't behave as it should?

@philMarius
Author

Well, as far as I know (excuse my ignorance, still a junior), this does use the artifacts-keyring package, because if you don't set ARTIFACTS_KEYRING_NONINTERACTIVE_MODE to true it still opens the interactive prompt. I do agree, though, that the hard-coded token is not the best solution; I'm open to better ideas. Will reopen this issue.

@philMarius philMarius reopened this May 20, 2020
@dparkar

dparkar commented Jun 4, 2020

--extra-index-url worked fine for me. I am doing :

pip install --upgrade --extra-index-url https://<PAT>@pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/pypi/simple/ <pkg>

I have also installed keyring and artifacts-keyring and set ARTIFACTS_KEYRING_NONINTERACTIVE_MODE to true.

All this in a Dockerfile.

@dataders

dataders commented Jun 4, 2020

@dparkar I'd love to be wrong here, but if you include a PAT in your index URL, you don't even need artifacts-keyring! It's equivalent to bringing a lockpick to a door to which you already have a key.

Here's an example

> conda create -n do_i_need_PAT python=3.6 -y
> conda activate do_i_need_PAT 
> pip install <pkg> --extra-index-url https://<PAT>@pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/pypi/simple/
Successfully installed <pkg>

To reiterate our team's requirement: we would like to have our Azure Artifacts-hosted Python package listed in our source-controlled requirements.txt and be able to provision a Docker container that creates our desired Python environment without hard-coding an ADO auth token.

@philMarius
Author

philMarius commented Jun 7, 2020

To reiterate our team's requirement: we would like to have our Azure Artifacts-hosted Python package listed in our source-controlled requirements.txt and be able to provision a Docker container that creates our desired Python environment without hard-coding an ADO auth token.

This is essentially what we would like too, if possible; the PAT "works" but is not ideal.

@dataders

dataders commented Jun 7, 2020

@philMarius just opened a priority support request, 120060724001259. Let's hope we can get an answer!

@johnterickson
Contributor

@swanderz Could you elaborate on how you expect the auth to work? What/where is the secret that you expect the credential provider to leverage?

ARTIFACTS_KEYRING_NONINTERACTIVE_MODE just means "if the auth requires interactivity (e.g. 2FA) then fail instead of prompting".
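In other words, the flag only suppresses the prompt; it does not supply credentials. A minimal sketch of that behaviour (the function names here are illustrative, not artifacts-keyring's actual internals):

```python
import os


def noninteractive_mode() -> bool:
    # Illustrative: treat the variable as a boolean-ish string.
    value = os.environ.get("ARTIFACTS_KEYRING_NONINTERACTIVE_MODE", "false")
    return value.strip().lower() == "true"


def device_flow_prompt() -> str:
    # Stand-in for the real interactive device-code flow.
    return "token-from-browser"


def get_token(cached_token=None):
    """Return a cached token, or fall back to interactive auth --
    unless non-interactive mode forbids prompting."""
    if cached_token:
        return cached_token
    if noninteractive_mode():
        raise RuntimeError("interactive auth required but non-interactive mode is set")
    return device_flow_prompt()
```

This matches the behaviour seen in the original report: with no cached credential and the flag set to true, the lookup fails outright and pip reports no matching distribution.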

@dataders

dataders commented Jun 18, 2020

@johnterickson I appreciate you reaching out and asking for clarification. I'm definitely a newb in this space.

We are using Azure Machine Learning's AMLCompute, which lets you define Conda/pip environments on the fly.

My ask is a way to automate landing the ADO PAT into the Docker container's system credential store / keyring and have it made available non-interactively to pip when installing with an ADO Artifacts feed as an extra-index-url. Does that make sense? We currently hard-code our feed URL with the PAT, and we don't want to do that.

@rastala and the AML team are also looking at an alternative way to make this happen.

@johnterickson
Contributor

I added a note to #10, which looks like the same issue.

@philMarius
Author

Am similarly new to this space so excuse my ignorance. We have a different setup where we install our library from Artifacts on Databricks and use it from there. What we would love to see is being able to install the library without specifying the PAT in the URL at all and, instead, potentially use something like an environment variable. Also, we'd prefer to move away from PATs altogether and utilise something akin to service tokens if that's possible?

@dataders

dataders commented Jun 22, 2020

Also, we'd prefer to move away from PATs altogether and utilise something akin to service tokens if that's possible?

@philMarius -- agreed!
@johnterickson, is there support for authenticating to DevOps using a managed identity?

@johnterickson
Contributor

This is just a wrapper around https://github.com/microsoft/artifacts-credprovider, so you can pass it the secret via VSS_NUGET_EXTERNAL_FEED_ENDPOINTS (yes, it says NuGet, but don't worry about that 😊); see https://github.com/microsoft/artifacts-credprovider#environment-variables

You may also be interested in NUGET_CREDENTIALPROVIDER_SESSIONTOKENCACHE_ENABLED

Some more info on tokens:
https://docs.microsoft.com/en-us/azure/devops/pipelines/process/access-tokens?view=azure-devops&tabs=yaml
https://docs.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#systemaccesstoken
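Concretely, VSS_NUGET_EXTERNAL_FEED_ENDPOINTS holds a small JSON document in the shape shown in the artifacts-credprovider README; the feed URL, the FEED_PAT variable name, and the username value below are placeholders, not anything prescribed by this thread:

```python
import json
import os

# Sketch: assemble the endpoint-credentials JSON from an env-provided
# secret, then expose it to the credential provider.
endpoints = {
    "endpointCredentials": [
        {
            "endpoint": "https://pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/pypi/simple/",
            "username": "build",  # arbitrary; the token is what matters for PAT auth
            "password": os.environ.get("FEED_PAT", "<token>"),  # PAT or pipeline access token
        }
    ]
}
os.environ["VSS_NUGET_EXTERNAL_FEED_ENDPOINTS"] = json.dumps(endpoints)
```

In a pipeline, the password slot could be fed from System.AccessToken rather than a user-tied PAT, which is what the linked docs describe.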

@philMarius
Author

This is just a wrapper around microsoft/artifacts-credprovider, so you can pass it the secret via VSS_NUGET_EXTERNAL_FEED_ENDPOINTS (yes, it says NuGet, but don't worry about that); see microsoft/artifacts-credprovider#environment-variables

Thanks for this! As far as I can tell though it still requires PATs?

@dataders

@philMarius, maybe the gist of what @johnterickson is suggesting is that you can authenticate on one machine, then rip the resulting cached session token and put it on another machine (or Docker container)?

@philMarius
Author

Hmmm, that's not really a viable solution unfortunately, due to the continued reliance on PATs. We may stick with the URL insertion for the time being, until more robust authentication methods are available.

@johnterickson
Contributor

@philMarius I don’t follow what you mean by “reliance on PAT”. At the end of the day, either you need to have a Public Feed (anonymously accessible by the whole world) or you need to have some sort of secret (e.g. a PAT, certificate with private key, etc). Then, you need a way to pass the secret into the container.

Is your concern with the secret or how to pass the secret?

@philMarius
Author

My concern with PATs is that they're tied to a specific user; other devs can't manage access with keys not tied to them. Plus, we want to automate some of our jobs and pull libraries programmatically, which will require passing keys around, and I'd prefer they not be tied to specific users.

@troll-os

I'm having the same concern today. PATs are not a long-term solution, as they are tied to users. If I leave my org, all the pipelines built around them will fail.

It would be awesome if we could authenticate to feeds using service principals, but it seems there is no way to do this as of today...
