[CoE Starter Kit - Suggestion] AUDIT LOGS USED 80% ! Your flow is consuming too much data #8334

Closed
ClaudioRWS opened this issue May 22, 2024 · 6 comments

ClaudioRWS commented May 22, 2024

Does this question already exist in our backlog?

  • I have checked and confirm this is a new question.

What is your question?

Hey guys,

Today I came across a somewhat unusual message in my environment:

[screenshot of the warning message]

To alleviate the problem, I suggest changing the flow frequency from 1 hour to 2 hours. I believe this difference will help the flow stay under the limit, and the impact on administrators will not be that big.
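
For reference, a minimal sketch of what that change amounts to; this is just the generic shape of a cloud flow recurrence trigger expressed as a Python dict, not the actual CoE Starter Kit flow definition:

```python
# Generic recurrence trigger shape (illustrative only, not the CoE Starter Kit
# flow definition): the frequency stays "Hour" and the interval goes from 1 to 2.
recurrence_trigger = {
    "type": "Recurrence",
    "recurrence": {
        "frequency": "Hour",
        "interval": 2,  # previously 1 -- the flow now wakes up every two hours
    },
}
```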

What solution are you experiencing the issue with?

Core

What solution version are you using?

4.31

What app or flow are you having the issue with?

Admin | Audit Logs | Sync Audit Logs (V2)

What method are you using to get inventory and telemetry?

Cloud flows

AB#3341

ClaudioRWS added the coe-starter-kit (CoE Starter Kit issues) and question (Further information is requested) labels on May 22, 2024
@Jenefer-Monroe (Collaborator)

You'll need to also change this, or you will miss every other hour of telemetry, which will likely get you back to your 80% unfortunately.
[screenshot]

We do have a potential solution in the works to help here, thanks to a new connector, but we cannot promise anything yet.
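
To illustrate the point (with hypothetical names and numbers, not the actual Sync Audit Logs logic): if the trigger now fires every 2 hours but the query window is left at 1 hour, half of each day's audit events are never collected.

```python
# Hypothetical sketch of why the query window must follow the recurrence
# interval: a 2-hour trigger with a 1-hour lookback covers only 12 of 24 hours.
from datetime import datetime, timedelta, timezone

RECURRENCE_INTERVAL_HOURS = 2  # the new trigger interval
LOOKBACK_HOURS = 1             # if left at 1, every other hour is skipped

start_of_day = datetime(2024, 5, 22, tzinfo=timezone.utc)
covered = timedelta()
for hour in range(RECURRENCE_INTERVAL_HOURS, 25, RECURRENCE_INTERVAL_HOURS):
    run_time = start_of_day + timedelta(hours=hour)
    window_start = run_time - timedelta(hours=LOOKBACK_HOURS)
    covered += run_time - window_start

print(f"audit-log coverage: {covered} out of 24:00:00")  # 12:00:00 with these values
```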

Jenefer-Monroe self-assigned this May 22, 2024
@ClaudioRWS (Author)

Thanks for your help @Jenefer-Monroe, I've already made the change.

Since it is a simple change, there is no problem waiting as long as necessary.

[screenshot]


petepuu commented May 23, 2024

@ClaudioRWS Interested to know whether that change will have an effect, as the limit is related to data read from / written to the flow run history. I assume that means the total amount of data processed by each action, i.e. what we can see for each action in the flow run history (all the JSON data etc.). It probably does not affect the amount of data the flow processes, since it still needs to handle the same amount of Audit Log content slots and events. A small benefit comes from the actions related to local flow variables and environment variables.

It would be great to see what the current content consumption per flow is, but I guess that info is not surfaced anywhere. Do you still have that warning message active there?

The following table describes the content throughput limits, which refer to the amount of data that's read from or written to the run history of the cloud flow.

https://learn.microsoft.com/en-us/power-automate/limits-and-config#content-throughput-limits
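
A back-of-the-envelope sketch of that reasoning (all numbers made up): halving the run frequency doubles the data each run handles, but the total processed per day stays the same, since the flow still has to work through the same audit events either way.

```python
# Illustrative only -- event counts and sizes are invented; check the linked
# docs for how the content throughput limit is actually metered.
EVENTS_PER_HOUR = 5_000   # hypothetical audit events generated in the tenant
BYTES_PER_EVENT = 2_000   # hypothetical average JSON payload per event

for interval_hours in (1, 2):
    runs_per_day = 24 // interval_hours
    mb_per_run = EVENTS_PER_HOUR * interval_hours * BYTES_PER_EVENT / 1e6
    mb_per_day = mb_per_run * runs_per_day
    print(f"every {interval_hours}h: {mb_per_run:.0f} MB per run, {mb_per_day:.0f} MB per day")
```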

@Jenefer-Monroe (Collaborator)

I don't think this limit is about the number of runs but about the amount of data processing operations.

The new solution is a new API, the Purview Audit Search Graph API, which allows us to filter to just Power Apps from the backend. So we won't need to load and then filter out every Excel boot and Dataverse operation anymore, which will substantially decrease the amount of data processing done in the flow.

We are waiting for it to GA before we ship it as an option, so hopefully this summer!
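
For anyone curious what that backend filtering could look like, here is a rough sketch against the Microsoft Graph audit log query endpoint (security/auditLog/queries); the permission name, record-type value, and property names are assumptions to verify against the Graph documentation once the API is GA:

```python
# Rough sketch: ask the Purview Audit Search Graph API for Power Apps events
# only, instead of downloading all Office 365 audit content and filtering it
# client-side. Endpoint/property/record-type names are assumptions -- verify
# against the Microsoft Graph docs.
import time
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<token with an AuditLogsQuery permission>"  # obtain via MSAL or similar
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# 1. Create an asynchronous audit log query scoped to Power Apps record types.
query = {
    "displayName": "CoE - Power Apps events",
    "filterStartDateTime": "2024-05-22T00:00:00Z",
    "filterEndDateTime": "2024-05-23T00:00:00Z",
    "recordTypeFilters": ["powerAppsApp"],  # assumed record-type value
}
resp = requests.post(f"{GRAPH}/security/auditLog/queries", headers=HEADERS, json=query)
resp.raise_for_status()
query_id = resp.json()["id"]

# 2. Poll until the query has finished running.
while True:
    status = requests.get(f"{GRAPH}/security/auditLog/queries/{query_id}", headers=HEADERS).json()
    if status.get("status") in ("succeeded", "failed"):
        break
    time.sleep(30)

# 3. Page through the matching records, already filtered on the service side.
url = f"{GRAPH}/security/auditLog/queries/{query_id}/records"
while url:
    page = requests.get(url, headers=HEADERS).json()
    for record in page.get("value", []):
        print(record.get("createdDateTime"), record.get("operation"))
    url = page.get("@odata.nextLink")
```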


petepuu commented May 23, 2024

@Jenefer-Monroe Yes, the action-related (Power Platform requests) limit notifications are a bit different.

Hopefully Audit Log Graph API will be GA in June 🙂
https://www.microsoft.com/en-us/microsoft-365/roadmap?featureid=117587

It seems we now have this on the roadmap also for GCC and DoD tenants:
https://www.microsoft.com/en-us/microsoft-365/roadmap?featureid=167531

ClaudioRWS (Author) commented May 27, 2024

(Quoting @petepuu's comment above.)

[screenshot]

@petepuu and @Jenefer-Monroe Hello!
Thanks, problem solved!

Yes, the solution worked; I no longer receive the message.

Even though it is a flow with many executions throughout the day, the change solved the problem.

[screenshot]
