Dashboard export/import doesn't work in v1.3 #16434

Closed
2 of 3 tasks
jayakrishnankk opened this issue Aug 25, 2021 · 10 comments
Assignees
Labels
validation:required A committer should validate the issue

Comments

@jayakrishnankk
Contributor

jayakrishnankk commented Aug 25, 2021

Getting the following error during export > import of dashboards.

This Session's transaction has been rolled back due to a previous exception during flush. 
To begin a new transaction with this Session, first issue Session.rollback(). 
Original exception was: Dataset [reporting].[public].[lookup] already exists 
(Background on this error at: http://sqlalche.me/e/13/7s2a)

Expected results

Prior to the recent 1.3 update, we were able to do the following.

Environment setup and workflow:

  1. We have 2 environments [dev, prod]
  2. We do our dashboard development in the dev environment
  3. Once the dashboard looks good, we deploy the changes to prod with the following 3 steps (a rough sketch of this workflow is shown after the list)
    3.1 Export the dashboard from dev
    3.2 Import the dashboard to prod
    3.3 Replace the slug for the new dashboard so that users don't need to update their bookmarks
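
For reference, here is a rough sketch of how this promotion can be scripted. It assumes Superset's v1 REST API (/api/v1/security/login, /api/v1/dashboard/export/, /api/v1/dashboard/import/) with VERSIONED_EXPORT enabled and the Python requests library; the hostnames, ids, field names and slug are placeholders, and depending on configuration a CSRF token may also be required, so treat it as illustrative rather than exact. With the flag off, the legacy /superset/import_dashboards/ endpoint (seen in the logs below) is used instead.

import requests

DEV = "https://superset-dev.example.com"    # hypothetical hostnames
PROD = "https://superset-prod.example.com"


def login(base_url, username, password):
    # Obtain a JWT access token from Superset's security API.
    resp = requests.post(
        f"{base_url}/api/v1/security/login",
        json={"username": username, "password": password,
              "provider": "db", "refresh": True},
    )
    resp.raise_for_status()
    return {"Authorization": f"Bearer {resp.json()['access_token']}"}


dev_headers = login(DEV, "deployer", "***")
prod_headers = login(PROD, "deployer", "***")

# 3.1 Export the dashboard from dev (the q parameter is a RISON-encoded list of ids).
export = requests.get(
    f"{DEV}/api/v1/dashboard/export/",
    params={"q": "!(42)"},          # 42 is a placeholder dashboard id
    headers=dev_headers,
)
export.raise_for_status()

# 3.2 Import the bundle into prod, overwriting the previous copy if present.
imported = requests.post(
    f"{PROD}/api/v1/dashboard/import/",
    headers=prod_headers,
    files={"formData": ("dashboard_export.zip", export.content)},
    data={"overwrite": "true"},
)
imported.raise_for_status()

# 3.3 Re-apply the stable slug so existing bookmarks keep working
# (99 and "weekly-reporting" are placeholders).
requests.put(
    f"{PROD}/api/v1/dashboard/99",
    headers=prod_headers,
    json={"slug": "weekly-reporting"},
).raise_for_status()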

Actual results

We get the following error: Dataset %s already exists

{"asctime": "2021-08-19 15:00:13,977", "threadName": "Dummy-10563", "request_id": "803a6e3e-0ea2-488a-846b-f6550e4ff4fd", "levelname": "ERROR", "name": "superset.views.core", "lineno": 695, "message": "Dataset [reporting].[public].[lookup] already exists", "exc_info": "Traceback (most recent call last):
  File "/app/superset/views/core.py", line 680, in import_dashboards
    {import_file.filename: import_file.read()}, database_id
  File "/app/superset/dashboards/commands/importers/v0.py", line 353, in run
    import_dashboards(db.session, content, self.database_id)
  File "/app/superset/dashboards/commands/importers/v0.py", line 323, in import_dashboards
    new_dataset_id = import_dataset(table, database_id, import_time=import_time)
  File "/app/superset/datasets/commands/importers/v0.py", line 110, in import_dataset
    database_id,
  File "/app/superset/datasets/commands/importers/v0.py", line 204, in import_datasource
    session.flush()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 163, in do
    return getattr(self.registry(), name)(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2536, in flush
    self._flush(objects)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2678, in _flush
    transaction.rollback(_capture_exception=True)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    with_traceback=exc_tb,
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2638, in _flush
    flush_context.execute()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
    rec.execute(self)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
    uow,
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py", line 213, in save_obj
    ) in _organize_states_for_save(base_mapper, states, uowtransaction):
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py", line 387, in _organize_states_for_save
    mapper.dispatch.before_update(mapper, connection, state)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/event/attr.py", line 322, in __call__
    fn(*args, **kw)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/events.py", line 719, in wrap
    fn(*arg, **kw)
  File "/app/superset/connectors/sqla/models.py", line 1654, in before_update
    raise Exception(get_dataset_exist_error_msg(target.full_name))
Exception: Dataset [reporting].[public].[lookup] already exists"}


{"asctime": "2021-08-19 15:00:13,981", "threadName": "Dummy-10563", "request_id": "803a6e3e-0ea2-488a-846b-f6550e4ff4fd", "levelname": "ERROR", "name": "superset.views.base", "lineno": 444, "message": "This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: Dataset [reporting].[public].[lookup] already exists (Background on this error at: http://sqlalche.me/e/13/7s2a)", "exc_info": "Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.7/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
    return f(self, *args, **kwargs)
  File "/app/superset/utils/log.py", line 241, in wrapper
    value = f(*args, **kwargs)
  File "/app/superset/views/core.py", line 707, in import_dashboards
    databases = db.session.query(Database).all()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3373, in all
    return list(self)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3535, in __iter__
    return self._execute_and_instances(context)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3557, in _execute_and_instances
    querycontext, self._connection_from_session, close_with_result=True
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3572, in _get_bind_args
    mapper=self._bind_mapper(), clause=querycontext.statement, **kw
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3550, in _connection_from_session
    conn = self.session.connection(**kw)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1141, in connection
    execution_options=execution_options,
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1147, in _connection_for_bind
    engine, execution_options
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 409, in _connection_for_bind
    self._assert_active()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 296, in _assert_active
    code="7s2a",
sqlalchemy.exc.InvalidRequestError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: Dataset [reporting].[public].[lookup] already exists (Background on this error at: http://sqlalche.me/e/13/7s2a)"}
{"asctime": "2021-08-19 15:00:13,984", "threadName": "Dummy-10563", "request_id": "803a6e3e-0ea2-488a-846b-f6550e4ff4fd", "levelname": "INFO", "name": "maf", "lineno": 61, "message": "response for POST https://reports-uat.storyboard.nielsen.com/superset/import_dashboards/ 500 INTERNAL SERVER ERROR", "method": "POST", "url": "https://reports-uat.storyboard.nielsen.com/superset/import_dashboards/", "status": "500 INTERNAL SERVER ERROR"}

Screenshots

(screenshot of the import error)

How to reproduce the bug

  1. set the "VERSIONED_EXPORT": True feature flag (see the config sketch after this list)
  2. Start superset with examples
  3. Export an existing dashboard
  4. Import it back to the same instance
  5. See error
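
For step 1, the feature flag would typically be set in superset_config.py; a minimal sketch (only the VERSIONED_EXPORT entry is relevant to this issue):

# superset_config.py
FEATURE_FLAGS = {
    "VERSIONED_EXPORT": True,
}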

Environment

(please complete the following information):

  • superset version: 1.3
  • python version: 3.7.9
  • node.js version: v14.15.5
  • any feature flags active:

Checklist

Make sure to follow these steps before submitting your issue - thank you!

  • I have checked the superset logs for python stacktraces and included it here as text if there are any.
  • I have reproduced the issue with at least the latest released version of superset.
  • I have checked the issue tracker for the same issue and I haven't found one similar.

Additional context

Tracked down the error to this pull request #15909, and specifically this change
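
From reading the traceback (not the exact upstream diff), the change appears to add a SQLAlchemy before_update hook on SqlaTable that refuses to flush a dataset whose identity collides with an existing one. A simplified illustration of that kind of check, assuming the hook queries by database/schema/table name (SqlaTable and get_dataset_exist_error_msg are Superset's own objects from superset/connectors/sqla/models.py, referenced here only for illustration):

import sqlalchemy as sa


def before_update(mapper, connection, target):
    # target is the SqlaTable instance being flushed by the importer.
    session = sa.inspect(target).session
    duplicate = (
        session.query(SqlaTable)
        .filter(
            SqlaTable.database_id == target.database_id,
            SqlaTable.schema == target.schema,
            SqlaTable.table_name == target.table_name,
        )
        .first()
    )
    if duplicate is not None:
        # This is the exception surfaced in the logs above.
        raise Exception(get_dataset_exist_error_msg(target.full_name))


sa.event.listen(SqlaTable, "before_update", before_update)

If that reading is right, the legacy (v0) importer trips the check because it updates the existing dataset row in place during import, so the "duplicate" it finds is the very row being overwritten.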

@jayakrishnankk jayakrishnankk added the #bug Bug report label Aug 25, 2021
@junlincc junlincc removed the #bug Bug report label Aug 25, 2021
@junlincc
Member

@john-bodley @etr2460 hi! do you mind taking a look? ^ thanks!

@junlincc junlincc added the validation:required A committer should validate the issue label Aug 25, 2021
@etr2460
Member

etr2460 commented Aug 26, 2021

I think @betodealmeida knows the most about export/import, we don't really use this feature :/

@junlincc
Member

gotcha. @jayakrishnankk troubleshot the issue and tracked it down to a potential cause of the regression, #15909. Would you mind taking a look? @etr2460

@FedericoCalonge

FedericoCalonge commented Sep 1, 2021

Same issue here. I get the same error when exporting and importing any example dashboard in the same Superset environment (v1.3).

@amitmiran137 amitmiran137 added #bug:blocking! Blocking issues with high priority and removed #bug:blocking! Blocking issues with high priority v1.3 labels Sep 2, 2021
@amitmiran137
Member

This is a user mistake: turning off the feature flag falls back to the old import mechanism. See #16569.

@vivek-kandhvar

We have the same issue after upgrading superset from 1.2.0 to 1.3.0.
We haven't touched the "VERSIONED_EXPORT" flag, and it is still False, which is the default.

This used to work in 1.2.0 without any issue. Now we end up with this error "Dataset <xyz> already exists!"


@jitendra-dangi

Getting the same error; unable to export or import any of the example dashboards.

@FedericoCalonge

Same issue here.
I get the same error when exporting and importing any example dashboard in the same Superset environment (v1.3).
I have the VERSIONED_EXPORT flag set to False.

@tomasfarias

We test new versions of Superset by importing dashboards from our production Superset (currently in version 1.0) to our testing Superset (currently in version 1.3.2).

We were seeing the issues described here when importing dashboards into the testing Superset. Just wanted to add that setting the VERSIONED_EXPORT flag to True in our testing Superset fixed the issue, and we can now export and import dashboards.

Just adding this since this appears to be one of the first results when googling around, and having to turn on a feature flag may not be so intuitive. At least until #16569 is merged and the flag is turned on by default.

@Balamuralia

We are trying to import a dashboard from the development environment to the production environment. It keeps throwing the error below.

The VERSIONED_EXPORT flag is true in both the development and production environments. We are using Apache Superset 1.3.1. Below is the error log.

Error running import command

2022-03-20 16:30:16 default[1-3-1qa] Traceback (most recent call last):
  File "/workspace/superset/commands/importers/v1/__init__.py", line 67, in run
    self._import(db.session, self._configs, self.overwrite)
  File "/workspace/superset/dashboards/commands/importers/v1/__init__.py", line 126, in _import
    config = update_id_refs(config, chart_ids, dataset_info)
  File "/workspace/superset/dashboards/commands/importers/v1/utils.py", line 85, in update_id_refs
    for old_id, columns in metadata["filter_scopes"].items()
  File "/workspace/superset/dashboards/commands/importers/v1/utils.py", line 85, in
    for old_id, columns in metadata["filter_scopes"].items()
KeyError: 580
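
My reading of that traceback (an assumption, not the exact Superset code): update_id_refs rebuilds the dashboard's filter_scopes metadata by mapping old chart ids to the ids of the newly imported charts, and a stale filter_scopes entry that points at a chart which is not part of the export bundle has no mapping, hence the KeyError. A minimal reproduction of that failure mode with made-up ids:

id_map = {101: 7, 102: 8}  # old chart id -> new chart id (hypothetical values)
filter_scopes = {
    "101": {"region_filter": {"scope": ["ROOT_ID"], "immune": []}},
    "580": {"region_filter": {"scope": ["ROOT_ID"], "immune": []}},  # stale reference
}

remapped = {
    str(id_map[int(old_id)]): columns  # raises KeyError: 580
    for old_id, columns in filter_scopes.items()
}

If that is what is happening, removing the stale filter_scopes entry from the exported dashboard metadata before importing may work around it.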
