Running cell for Analyze in Notebook fails when Knox endpoint in BDC has a different password #5058

Closed
corivera opened this issue Apr 16, 2019 · 9 comments


@corivera (Member)

  • Azure Data Studio Version: master

Steps to Reproduce:

  1. Create a SQL Big Data Cluster that uses different passwords for the SQL and Knox endpoints.
  2. Run Analyze in Notebook on a CSV file in the cluster's HDFS.
  3. Run the generated cell. It throws the following 401 Unauthorized error.

Error:
The code failed because of a fatal error:
Invalid status code '401' from https://X.X.X.X:30443/gateway/default/livy/v1/sessions with error payload: .
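
For context, a minimal sketch (not part of the original report; the host, user name, and passwords are placeholders) of how the failing request can be reproduced outside the notebook. The gateway fronting Livy uses HTTP basic authentication, so posting to the sessions endpoint with the SQL endpoint's password is rejected with a 401 when Knox was deployed with a different password:

import requests

GATEWAY_HOST = "X.X.X.X"  # placeholder: Knox gateway address of the cluster
LIVY_SESSIONS_URL = f"https://{GATEWAY_HOST}:30443/gateway/default/livy/v1/sessions"

def can_create_livy_session(user: str, password: str) -> bool:
    """Return True if the gateway accepts these credentials for Livy."""
    resp = requests.post(
        LIVY_SESSIONS_URL,
        json={"kind": "pyspark"},
        auth=(user, password),
        verify=False,  # BDC gateways commonly use self-signed certificates
    )
    return resp.status_code != 401

# can_create_livy_session("root", SQL_PASSWORD)   -> False (HTTP 401) when the passwords differ
# can_create_livy_session("root", KNOX_PASSWORD)  -> True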

adsbot bot commented Apr 16, 2019

Thanks for submitting this issue. Please also check whether it is already covered by an existing issue.

@alanrenmsft (Contributor)

@kevcunnane, IIUC this is by design and the passwords have to match, right?

@kevcunnane (Contributor)

@alanrenmsft Somewhat correct. We know this is a potential issue, so we should handle it. In the HDFS expansion scenario we prompt for the password if authentication fails. That's the correct solution here too, so let's keep this open.

@alanrenmsft alanrenmsft added this to the Planning milestone Apr 17, 2019
@chlafreniere chlafreniere self-assigned this Apr 19, 2019
@kevcunnane kevcunnane modified the milestones: May 2019 Release, Planning May 1, 2019
@kevcunnane (Contributor)

Moving this into June. When connecting with Spark kernels, we should be able to verify the password (by pinging HDFS) and show an error / connection dialog if it fails.

@YurongHe this would be a great one to pick up this week if you can.
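
A rough sketch (assuming the Knox-proxied WebHDFS path shown, a placeholder host/user, and a hypothetical prompt helper) of the "ping HDFS to verify the password" check described above, which could run before the Spark kernel opens a Livy session:

import requests

def knox_password_is_valid(host: str, user: str, password: str) -> bool:
    """Probe the gateway with a cheap WebHDFS call; 401 means bad credentials."""
    url = f"https://{host}:30443/gateway/default/webhdfs/v1/?op=LISTSTATUS"
    resp = requests.get(url, auth=(user, password), verify=False)
    return resp.status_code != 401

# Possible flow in the notebook connection path (pseudocode):
# if not knox_password_is_valid(host, "root", stored_password):
#     prompt_for_password_and_retry()  # hypothetical helper, mirroring the
#                                      # HDFS expansion behavior mentioned earlier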

@YurongHe (Contributor)

Related to #5753

@Guillaume-Fourrat

Hi, confirming this still reproduces with ADS:

Version: 1.10.0-insider (user setup)
Commit: 7f6839d38e24ebd637f2b7ae3b7a5995421c44b2
Date: 2019-08-18T14:37:30.357Z

Hitting the following message for any Spark interactive payload as soon as SAPWD != KNOXPWD:


The code failed because of a fatal error:
	Invalid status code '401' from https://10.1.0.5:30443/gateway/default/livy/v1/sessions with error payload: .

Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context.
b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c) Restart the kernel.

Could you confirm the following?

> @kevcunnane, IIUC this is by design and the passwords have to match, right?

The "somewhat correct" answer from @kevcunnane is a bit confusing =)

As it stands, this conflicts with the current documentation as follows:

  1. There is no mention of a requirement to keep the SA password aligned with the Knox password.
  2. Worse, the guidance that the SA password is discoverable and should be changed after deployment is likely to lead to different passwords in most situations.

So I'm not sure whether we have a docs bug (we need to state that the passwords must be aligned) or an ADS bug (there is some invalid reuse of credentials)?

@chlafreniere (Contributor)

@kevcunnane does this still repro?

@kevcunnane (Contributor)

@chlafreniere yes it does.

@Charles-Gagnon (Contributor)

BDC support in ADS is being deprecated, so closing all related issues.
