
[Bug] AuthByKeyPair.prepare() takes 1 positional argument but 3 were given #1032

Closed
mmyers5 opened this issue May 10, 2024 · 5 comments
Labels: bug (Something isn't working), triage

Comments

mmyers5 commented May 10, 2024

Is this a new bug in dbt-core?

  • I believe this is a new bug in dbt-core
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

We are using docker compose in GitHub Actions to generate dbt docs and publish them to an external storage location. When the CI/CD job runs docker compose run dbt docs generate --target nondev, it randomly fails with the error: AuthByKeyPair.prepare() takes 1 positional argument but 3 were given. Sometimes the CI/CD job succeeds, and I haven't been able to get the failure to trigger locally.

It is worth noting that the upstream CI/CD job, which runs dbt --warn-error debug --target nondev, always succeeds.

Expected Behavior

No flaky job results: dbt docs generate --target nondev should succeed consistently.

Steps To Reproduce

  1. Here are the environment variables that exist:
    • SNOWFLAKE_ACCOUNT
    • SNOWFLAKE_USER
    • SNOWFLAKE_AUTHENTICATOR=snowflake_jwt
    • SNOWFLAKE_RSA_KEY_SECRET
    • SNOWFLAKE_RSA_KEY_PASSWORD
    • SNOWFLAKE_ROLE
    • SNOWFLAKE_DATABASE
    • SNOWFLAKE_WAREHOUSE
    • SNOWFLAKE_TARGET_SCHEMA
  2. This is the profile that we are using (see the key-parsing sketch after this list):
    nondev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      authenticator: "{{ env_var('SNOWFLAKE_AUTHENTICATOR', 'snowflake') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD', None) }}"
      private_key: "{{ env_var('SNOWFLAKE_RSA_KEY_SECRET', '') }}"
      private_key_passphrase: "{{ env_var('SNOWFLAKE_RSA_KEY_PASSWORD', '') }}"
      role: "{{ env_var('SNOWFLAKE_ROLE') }}"
      database: "{{ env_var('SNOWFLAKE_DATABASE') }}"
      warehouse: "{{ env_var('SNOWFLAKE_WAREHOUSE') }}"
      schema: "{{ env_var('SNOWFLAKE_TARGET_SCHEMA', 'dev') }}"
      threads: 8 # The number of models that can run concurrently in production
      client_session_keep_alive: False
      # optional
      connect_retries: 0 # default 0
      connect_timeout: 10 # default: 10
      retry_on_database_errors: False # default: false 
      retry_all: False  # default: false
  3. Here is the Docker image that we are using: ghcr.io/dbt-labs/dbt-snowflake:1.7.3. We have an entrypoint that prefixes any command passed into the container with dbt, so docker compose run dbt docs ends up running dbt docs inside the container (the first dbt is the compose service name).
  4. Upstream job that is being run in CI/CD successfully:
    • docker compose build dbt
    • docker compose run dbt --warn-error parse --target nondev
    • docker compose run dbt --warn-error debug --target nondev
  5. Steps of the job that is failing:
    • docker compose build dbt
    • docker compose run dbt docs generate --target nondev
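
As a sanity check outside of dbt (an illustrative sketch, not part of the original report), the key material supplied via SNOWFLAKE_RSA_KEY_SECRET and SNOWFLAKE_RSA_KEY_PASSWORD can be parsed directly with the cryptography library; DER/PKCS#8 bytes are what key-pair auth ultimately needs. The env var names simply mirror the ones listed in step 1.

    # Sketch only: confirm the PEM key and passphrase from the environment parse cleanly,
    # independent of dbt. Assumes SNOWFLAKE_RSA_KEY_SECRET holds a PEM-encoded private key.
    import os
    from cryptography.hazmat.primitives import serialization

    key = serialization.load_pem_private_key(
        os.environ["SNOWFLAKE_RSA_KEY_SECRET"].encode(),
        password=os.environ["SNOWFLAKE_RSA_KEY_PASSWORD"].encode(),
    )

    # Key-pair auth in snowflake-connector-python expects DER-encoded PKCS#8 bytes.
    der_key = key.private_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    )
    print(f"key parsed OK ({len(der_key)} DER bytes)")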

Relevant log output

On a failed run:

18:41:20  Running with dbt=1.7.11
18:41:21  Registered adapter: snowflake=1.7.3
18:41:21  Unable to do partial parsing because saved manifest not found. Starting full parse.
18:41:32  Found 420 models, 2 analyses, 154 tests, 19 seeds, 351 sources, 33 exposures, 0 metrics, 1035 macros, 0 groups, 0 semantic models
18:41:32  
18:41:43  Encountered an error:
Database Error
  AuthByKeyPair.prepare() takes 1 positional argument but 3 were given

On a successful run:

18:15:02  Running with dbt=1.7.11
18:15:03  Registered adapter: snowflake=1.7.3
18:15:03  Unable to do partial parsing because saved manifest not found. Starting full parse.
18:15:14  Found 420 models, 2 analyses, 154 tests, 19 seeds, 351 sources, 33 exposures, 0 metrics, 1035 macros, 0 groups, 0 semantic models
18:15:14  
18:16:15  Concurrency: 8 threads (target='nondev')
18:16:15  
18:16:39  Building catalog
18:16:55  Catalog written to /usr/app/dbt/target/catalog.json

Environment

- OS: Ubuntu 22.04.4
- Python: 3.10.7
- dbt: 1.7.11

Which database adapter are you using with dbt?

snowflake

Additional Context

No response

@mmyers5 mmyers5 added the bug (Something isn't working) and triage labels May 10, 2024
@mmyers5 mmyers5 changed the title from "[Bug] <title>" to "[Bug] AuthByKeyPair.prepare() takes 1 positional argument but 3 were given" May 10, 2024
@jtcohen6
Contributor

@mmyers5 Weird, I'm not sure what could be happening here, especially if the error is only intermittent.

Given where the error is being raised, it sounds like it could be a bug in snowflake-connector-python, or at least a place where they might want some more defensive code.

At the end of the day, all we're really doing is using profiles.yml to validate and pass arguments into snowflake.connector.connect. From that point, they flow into the connector's own auth handling, which is where AuthByKeyPair.prepare() lives.

So I'd recommend you open a bug report in that repository, too; the team there would know better than I would how unexpected positional arguments might be flowing through.
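
For illustration only (a rough sketch of that hand-off, not dbt-snowflake's actual code), the profile fields map onto a snowflake.connector.connect() call roughly like this; the keyword names follow the connector's documented API:

    # Rough sketch of the hand-off described above; keyword names follow the documented
    # snowflake.connector.connect() parameters, not dbt-snowflake's internals.
    import os
    from cryptography.hazmat.primitives import serialization
    import snowflake.connector

    key = serialization.load_pem_private_key(
        os.environ["SNOWFLAKE_RSA_KEY_SECRET"].encode(),
        password=os.environ["SNOWFLAKE_RSA_KEY_PASSWORD"].encode(),
    )
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        authenticator="snowflake_jwt",     # SNOWFLAKE_AUTHENTICATOR in the profile
        private_key=key.private_bytes(     # connector expects DER/PKCS#8 bytes
            encoding=serialization.Encoding.DER,
            format=serialization.PrivateFormat.PKCS8,
            encryption_algorithm=serialization.NoEncryption(),
        ),
        role=os.environ["SNOWFLAKE_ROLE"],
        database=os.environ["SNOWFLAKE_DATABASE"],
        warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
    )
    print(conn.cursor().execute("select current_version()").fetchone())
    conn.close()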

@jtcohen6 jtcohen6 transferred this issue from dbt-labs/dbt-core May 11, 2024
@mmyers5
Author

mmyers5 commented May 13, 2024

Sounds good, will do

@mmyers5
Author

mmyers5 commented May 14, 2024

Here are the --debug logs in case you find them helpful (I also attached them to the related issue):

17:39:18  On list_my_db_dsg: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_dsg"} */
show terse objects in my_db.dsg limit 10000
17:39:18  Acquiring new snowflake connection 'list_my_db_partnerships'
17:39:18  Opening a new connection, currently in state init
17:39:18  Using snowflake connection "list_my_db_partnerships"
17:39:18  Using snowflake connection "list_my_db_metadata"
17:39:18  On list_my_db_metadata: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_metadata"} */
show terse objects in my_db.metadata limit 10000
17:39:18  Acquiring new snowflake connection 'list_my_db_external'
17:39:18  Opening a new connection, currently in state init
17:39:18  Using snowflake connection "list_my_db_external"
17:39:18  On list_my_db_partnerships: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_partnerships"} */
show terse objects in my_db.partnerships limit 10000
17:39:18  On list_my_db_external: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_external"} */
show terse objects in my_db.external limit 10000
17:39:18  Opening a new connection, currently in state init
17:39:18  Opening a new connection, currently in state init
17:39:18  Acquiring new snowflake connection 'list_my_db_ltv'
17:39:18  Acquiring new snowflake connection 'list_my_db_gc'
17:39:19  Using snowflake connection "list_my_db_ltv"
17:39:19  Using snowflake connection "list_my_db_gc"
17:39:19  On list_my_db_ltv: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_ltv"} */
show terse objects in my_db.ltv limit 10000
17:39:19  Opening a new connection, currently in state init
17:39:19  On list_my_db_gc: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_gc"} */
show terse objects in my_db.gc limit 10000
17:39:19  Opening a new connection, currently in state init
17:39:19  Snowflake adapter: Error running SQL: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_metadata"} */
show terse objects in my_db.metadata limit 10000
17:39:19  Snowflake adapter: Rolling back transaction.
17:39:19  Snowflake adapter: Error running SQL: macro list_relations_without_caching
17:39:19  Snowflake adapter: Rolling back transaction.
17:39:19  On list_my_db_metadata: No close available on handle
17:39:19  Re-using an available connection from the pool (formerly list_my_db_metadata, now list_my_db_base)
17:39:19  Using snowflake connection "list_my_db_base"
17:39:19  On list_my_db_base: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_base"} */
show terse objects in my_db.base limit 10000
17:39:19  Opening a new connection, currently in state closed
17:39:19  SQL status: SUCCESS 19 in 1.0 seconds
17:39:19  On list_my_db_external: Close
17:39:19  SQL status: SUCCESS 30 in 1.0 seconds
17:39:19  On list_my_db_partnerships: Close
17:39:19  SQL status: SUCCESS 71 in 1.0 seconds
17:39:19  On list_my_db_intermediate: Close
17:39:19  SQL status: SUCCESS 7 in 1.0 seconds
17:39:19  On list_my_db_ltv: Close
17:39:19  SQL status: SUCCESS 153 in 1.0 seconds
17:39:19  On list_my_db_raw_events: Close
17:39:19  Re-using an available connection from the pool (formerly list_my_db_external, now list_my_db_tm)
17:39:19  Using snowflake connection "list_my_db_tm"
17:39:19  On list_my_db_tm: /* {"app": "dbt", "dbt_version": "1.7.11", "profile_name": "gc_snowflake", "target_name": "nondev", "connection_name": "list_my_db_tm"} */
show terse objects in my_db.tm limit 10000
17:39:19  Opening a new connection, currently in state closed
17:39:20  SQL status: SUCCESS 106 in 1.0 seconds
17:39:20  On list_my_db_gc: Close
17:39:20  SQL status: SUCCESS 129 in 0.0 seconds
17:39:20  On list_my_db_base: Close
17:39:20  SQL status: SUCCESS 44 in 0.0 seconds
17:39:20  On list_my_db_tm: Close
17:39:20  SQL status: SUCCESS 5 in 2.0 seconds
17:39:20  On list_my_db_dsg: Close
17:39:20  Connection 'master' was properly closed.
17:39:20  Connection 'list_my_db_raw_events' was properly closed.
17:39:20  Connection 'list_my_db_intermediate' was properly closed.
17:39:20  Connection 'list_my_db_dsg' was properly closed.
17:39:20  Connection 'list_my_db_partnerships' was properly closed.
17:39:20  Connection 'list_my_db_base' was properly closed.
17:39:20  Connection 'list_my_db_tm' was properly closed.
17:39:20  Connection 'list_my_db_ltv' was properly closed.
17:39:20  Connection 'list_my_db_gc' was properly closed.
17:39:20  Encountered an error:
Database Error
  AuthByKeyPair.prepare() takes 1 positional argument but 3 were given
17:39:20  Resource report: {"command_name": "generate", "command_wall_clock_time": 13.463965, "process_user_time": 14.95133, "process_kernel_time": 0.287372, "process_mem_max_rss": "196324", "process_out_blocks": "13528", "command_success": false, "process_in_blocks": "0"}
17:39:20  Command `dbt docs generate` failed at 17:39:20.917144 after 13.46 seconds
17:39:20  Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7ff218d9b340>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7ff210b64d00>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7ff2109a2830>]}
17:39:20  Flushing usage events

@jtcohen6
Contributor

Thanks @mmyers5! Weird that this happens in the middle of a run, where dbt has already managed to make several successful connections. Let's wait and see what the Snowflake team says.

@mmyers5
Author

mmyers5 commented May 20, 2024

We found the issue! In our Docker image that containerized the dbt CLI, we were installing our own version of snowflake-connector-python. Not sure how that led to flaky behavior within the same container, but resolving it seems to have fixed things. 🤷‍♀️ Thanks, y'all!
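
For anyone debugging something similar, a quick way to see which snowflake-connector-python a dbt image actually resolves (a generic sketch, not taken from this thread) is to compare the installed version against dbt-snowflake's own requirement:

    # Sketch: report the connector version the image resolves, to catch a self-installed
    # copy shadowing the version range pinned by dbt-snowflake.
    from importlib.metadata import requires, version

    print("snowflake-connector-python:", version("snowflake-connector-python"))
    for req in requires("dbt-snowflake") or []:
        if req.startswith("snowflake-connector-python"):
            print("dbt-snowflake requires:", req)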
