integrate function does not return what I expected (maybe newbie misunderstanding?) #701
Doing some more tests I'm still not much further. I noticed that I made a mistake with the VM Stat panel: the 6.135 MWh was shown as the MEAN, whereas for Influx I had selected LAST (NOT NULL). But even changing that doesn't help. To help with understanding my problem a bit better, I created this comparison showing the metric I'm integrating over as well as the integral. I think it's immediately obvious that the integral can't be correct: it's not possible to reach ~20,000,000 Wh with an average of maybe 2000 or 3000 W over 6 hours… Surely I must be overlooking something here?
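A quick back-of-the-envelope check illustrates the mismatch described above; the average power and the panel reading are rough figures taken from the comment, not exact measurements:

```python
# Sanity check: an integral of power (W) over time can never exceed
# average power * duration. The numbers below are approximations of
# the figures mentioned in the comment, not exact data.
avg_power_w = 2500           # assumed average between 2000 and 3000 W
duration_h = 6               # the selected 6-hour time range

expected_energy_wh = avg_power_w * duration_h   # plausible upper ballpark
reported_energy_wh = 20_000_000                 # roughly what the panel showed

print(expected_energy_wh)                       # 15000
print(round(reported_energy_wh / expected_energy_wh))  # off by a factor of ~1333
```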
Hi @bjoernbg! Thanks for the report! Marking as a bug for further prioritizing.
@bjoernbg ,
@valyala Thank you so much. The way InfluxDB handles the integral() function threw me off; with your adjustment it works. However… it proved a bit difficult for me to get one result for the whole selected time range ("Energy Produced"). With InfluxDB this is quite simple by using an aggregate function. For VictoriaMetrics I had to set "Minimum Step" to
That's an interesting observation that needs additional investigation. I'll look into it.
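One plausible source of a large constant-factor discrepancy between two integral implementations is the time unit they integrate over; the sketch below (with made-up sample values) shows how integrating in seconds vs. hours changes the result by a factor of 3600. Whether this is the actual cause here is an assumption, not something confirmed in this thread:

```python
# If one function integrates power over seconds and the other over hours,
# the results differ by 3600 even for identical data.
# Assumption: power samples in watts, evenly spaced one minute apart.
samples_w = [2000.0, 2500.0, 3000.0]   # hypothetical power readings
step_s = 60.0                          # one sample per minute

# Step-wise integral in watt-seconds (joules)
energy_ws = sum(v * step_s for v in samples_w)

# The same integral expressed in watt-hours
energy_wh = energy_ws / 3600.0

print(energy_ws)   # 450000.0
print(energy_wh)   # 125.0
```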
Try something like
Note also that the query from the last screenshot -
…ilar to calculations from InfluxDB Updates #701
@bjoernbg , could you build VictoriaMetrics from the commit e6da63d and verify whether it improves accuracy for
Take a look also at
Note that square brackets inside
@valyala I'm on the road on a business trip currently, but I appreciate your feedback and efforts a lot and will try to have a go at your suggestions on the weekend (also the custom build).
FYI, the commit mentioned above has been included in v1.41.0.
In my tests this produced results that didn't have anything to do with what I'm looking for. When I divide the result by the number of hours in my time range, the result becomes more realistic, but still almost 100 kWh off:
Nicely spotted, but this actually has to do with my specific setup: I have two inverters that log their data to the database, which I can access like this:
I updated my VictoriaMetrics build (using the 1.41.0 docker image) but didn't see any changes in Grafana.
When I omit the
There is no need to wrap
Just to add my 2 cents here:
Closing this issue as resolved. |
Hi there,
I'm thinking about switching from InfluxDB to VictoriaMetrics in hope of better performance and reliability (I'm looking at storing a moderate number of metrics at minute resolution). As a first evaluation step I exported my data from Influx into Influx line protocol and imported it into VM. The import of 300 MB of data was very fast, so things were looking great. So I started fiddling around with queries in Grafana and am still a bit dumbfounded by PromQL/MetricsQL. InfluxQL with its SQL-like syntax was for sure more accessible ;) But not to worry, I'm sure I'll get the hang of it.
However, as a first step I want to reproduce one of my most important calculations from Influx: an integral. I'm scraping data from my solar panel electricity production and want to calculate the area under the curve for a certain time frame (typically the selected time window in Grafana). However, the `integrate(…)` function returns vastly different results from InfluxQL's `integral()` function – much higher. There is also no documentation on its usage, so I'm not sure I'm doing everything correctly. The result of the integral in the InfluxQL example is definitely correct; I don't know what VictoriaMetrics is calculating there…

Btw.: VictoriaMetrics' data folder is 10% the size of InfluxDB's data folder for the data I tested with (5 MB vs. 50–60 MB!), so great work!
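For reference, the "area under the curve" calculation described here can be sketched independently of either database. This hypothetical helper applies the trapezoidal rule to (unix_timestamp, watts) pairs and returns watt-hours; the data points are made up for illustration:

```python
def integrate_wh(points):
    """Trapezoidal integral of (unix_timestamp, watts) samples.

    Returns the energy in watt-hours. Assumes points are sorted by time.
    """
    total_ws = 0.0
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        total_ws += (v0 + v1) / 2.0 * (t1 - t0)   # watt-seconds per segment
    return total_ws / 3600.0                       # convert W*s -> Wh

# One hour of hypothetical production data: ramps up, then back down.
points = [(0, 1000.0), (1800, 2000.0), (3600, 1000.0)]
print(integrate_wh(points))  # 1500.0
```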