
Get all the memory that a Semantic Model uses, not just the memory reported by SemPy's model_memory_analyzer #772

@tarente

Description

Is your feature request related to a problem? Please describe.
The model_memory_analyzer only reports some of the memory currently used by a Semantic Model.

Describe the solution you'd like
A way to report all the memory currently used by a Semantic Model that counts toward the Max Memory limit (see the Semantic model SKU limitation).
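
A minimal sketch of what such a report might be based on, assuming the standard Analysis Services DMV DISCOVER_OBJECT_MEMORY_USAGE is exposed for Fabric semantic models and that sempy.fabric.evaluate_dax passes DMV text through to the XMLA endpoint (both are assumptions; if evaluate_dax rejects DMV syntax, the same query can be run from any XMLA client):

```python
import sempy.fabric as fabric

# ASSUMPTION: evaluate_dax forwards the DMV text to the XMLA endpoint and
# the DMV is available for Fabric semantic models. "MyModel" is a
# placeholder dataset name.
df = fabric.evaluate_dax(
    dataset="MyModel",
    dax_string="SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE",
)

# Shrinkable memory (caches) can be released by clearing the cache;
# non-shrinkable memory counts against the SKU's Max Memory regardless.
total_bytes = (
    df["OBJECT_MEMORY_SHRINKABLE"].sum()
    + df["OBJECT_MEMORY_NONSHRINKABLE"].sum()
)
print(f"Approximate total model memory: {total_bytes / 1024**3:.2f} GiB")
```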

Describe alternatives you've considered
Sometimes, clearing the Semantic Model's cache solves the issue.
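
For reference, a minimal sketch of that workaround using the clear_cache helper from semantic-link-labs (dataset and workspace names are placeholders):

```python
import sempy_labs as labs

# Clears the semantic model's in-memory cache so that subsequent queries
# and refreshes start from a smaller memory footprint.
labs.clear_cache(dataset="MyModel", workspace="MyWorkspace")
```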

Additional context
A Semantic Model in Direct Lake mode (on SQL endpoint) dynamically loads tables, columns, etc. as needed to answer users' queries. This memory is reported by model_memory_analyzer. However, queries sent to the Semantic Model also consume other types of memory that model_memory_analyzer does not report.

For a Semantic Model in Direct Lake, a refresh (automatic or on-demand) may fail if the total memory consumed exceeds the Max Memory for the SKU.
As described above, clearing the cache may solve the issue, and this can be done when the Direct Lake refresh is on-demand. However, an automatic Direct Lake refresh failure due to Out of Memory only becomes visible after consecutive refresh failures, at which point the Customer receives an email. During that time the Semantic Model keeps showing outdated data.
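
For the on-demand case, the cache clear can be chained in front of the refresh as a stop-gap; a sketch assuming the semantic-link-labs helpers, with placeholder names:

```python
import sempy_labs as labs

DATASET = "MyModel"        # placeholder
WORKSPACE = "MyWorkspace"  # placeholder

# Release shrinkable cache memory first, so the refresh is less likely to
# push total consumption past the SKU's Max Memory limit.
labs.clear_cache(dataset=DATASET, workspace=WORKSPACE)

# Then trigger the on-demand refresh (for Direct Lake this re-frames the
# model against the latest data in OneLake).
labs.refresh_semantic_model(dataset=DATASET, workspace=WORKSPACE)
```

This does not help the automatic-refresh case, which is exactly the gap this request is about.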
