feat(ingestion/powerbi): Ingest datasets not used in PowerBI visualization(tiles/pages) #8212
Conversation
mostly looking good, just a minor code cleanup thing
    )
    cur_workspace.scan_result = workspace_metadata
    cur_workspace.datasets = self._get_workspace_datasets(
        cur_workspace.scan_result
    )
    for value in cur_workspace.datasets.values():
        logger.info(f"Mohd = {value.name}")
remove this
done
    dataset_workunits = self.mapper.generate_container_for_dataset(dataset)
    for workunit in dataset_workunits:
        yield workunit
    yield from self.extract_datasets_as_containers()
I'd like to keep the self.source_config.extract_datasets_to_containers check here
done
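For context, a minimal sketch of the gated call the reviewer asked for. The method names (extract_datasets_as_containers, source_config.extract_datasets_to_containers) come from the diff and comments above; the surrounding classes are hypothetical stand-ins, not the actual DataHub source code:

```python
from typing import Iterable, List


class FakeConfig:
    """Hypothetical stand-in for the PowerBI source config."""

    def __init__(self, extract_datasets_to_containers: bool) -> None:
        self.extract_datasets_to_containers = extract_datasets_to_containers


class FakeSource:
    """Hypothetical stand-in for the PowerBI ingestion source."""

    def __init__(self, config: FakeConfig) -> None:
        self.source_config = config

    def extract_datasets_as_containers(self) -> Iterable[str]:
        # Placeholder for the real container-workunit generator.
        yield "dataset-container-workunit"

    def get_workunits(self) -> Iterable[str]:
        # The review asks that this config flag gate container emission,
        # so users who don't opt in see no dataset containers.
        if self.source_config.extract_datasets_to_containers:
            yield from self.extract_datasets_as_containers()


enabled: List[str] = list(FakeSource(FakeConfig(True)).get_workunits())
disabled: List[str] = list(FakeSource(FakeConfig(False)).get_workunits())
```

With the flag on, the container workunits are yielded; with it off, the generator produces nothing, which is the behavior the reviewer wanted preserved.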
these tests are getting pretty unwieldy - is there a way we can reduce the duplication across all the golden files?
I removed the unwanted mock data; the file size is now reduced to 2k.
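On the reviewer's point about duplication across golden files: one common way to shrink test fixtures is to derive each mock payload from a shared base instead of storing full copies. A hedged sketch, with entirely hypothetical payload shapes and names (not the actual PowerBI test fixtures):

```python
import copy

# Shared base payload; each test case describes only what differs from it,
# instead of every golden file carrying a full copy of the mock scan result.
BASE_SCAN_RESULT = {
    "workspaces": [
        {
            "name": "demo-workspace",
            "datasets": [{"name": "library-dataset"}],
            "reports": [],
        }
    ]
}


def scan_result_variant(**overrides):
    """Return a deep copy of the base payload with per-test overrides applied."""
    result = copy.deepcopy(BASE_SCAN_RESULT)
    result["workspaces"][0].update(overrides)
    return result


# Two test cases expressed as small deltas rather than duplicated fixtures.
with_reports = scan_result_variant(reports=[{"name": "demo-report"}])
without_datasets = scan_result_variant(datasets=[])
```

Because each variant is a deep copy, tweaking one case never leaks into the base or into other cases, and the fixture files stay small.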
… github.com:mohdsiddique/datahub into master+acr-5435-ingest-independent-datasets-powerbi
… github.com:acryldata/datahub-fork into master+acr-5435-ingest-independent-datasets-powerbi
Merging because these changes aren't related to the kafka docker build failure
…ation(tiles/pages) (datahub-project#8212) Co-authored-by: MohdSiddiqueBagwan <mohdsiddique.bagwan@gslab.com>