
[PF 01] Track metrics #421

Closed
ToasterTheBrave opened this issue Jul 5, 2023 · 1 comment
ToasterTheBrave commented Jul 5, 2023

Track pigeon performance for use in scheduling weights

We want to assign jobs to pigeons based on several weighted features. The groundwork for this should already be done, but we need to track the features so we can use them.

  • The pigeon's message uptake rate
  • The pigeon's message delivery success rate
  • The pigeon's message delivery speed
  • Active MEV endpoint * (MEV endpoint setting)
  • Active tx type: 2 set * (EIP-1559 for lower gas)

More info can be found in the epic: https://www.notion.so/volumefi/EPIC-09-26-22-Paloma-Gas-Management-78319280833e4c7c8ffad368eddb1d91


⏰ Estimate

8 days

❓ Open questions

  • What's the best place to store the metrics? Directly in the scheduler, or in a new module? I need to take a fresh look at the code that's already there.

taariq commented Dec 5, 2023

Added two new weights:

  • Active MEV endpoint * (MEV endpoint setting)
  • Active tx type: 2 set * (EIP-1559 for lower gas)

@byte-bandit byte-bandit changed the title Track pigeon performance for use in scheduling weights [PF 01] Track metrics Jan 5, 2024
byte-bandit added a commit to palomachain/paloma that referenced this issue Feb 2, 2024
# Related Github tickets

- VolumeFi#421

# Background

The `metrix` module is responsible for capturing all kinds of domain
events
around relayer performance, aggregating them in a meaningful way and
making them accessible to consumers.

## The collected metrics

The following metrics are currently being gathered:

### Relayer Fee

> [!WARNING]  
> This is still very much a work in progress, as the concept of user fees
> has yet to be built into Paloma.

### Uptime

The uptime is a percentage value representing the network uptime of the
validator node attributed to the relayer.

It is updated once per block and calculated by evaluating the node's
`missed blocks` during the `signed blocks window` using the following
formula:

`((params.signed_blocks_window - signingInfo.missed_blocks_counter) /
params.signed_blocks_window) * 100`

> [!WARNING]  
> This still needs to be adapted for jailed validators, who will still
> be reported with an uptime of 100% (although they are not eligible for
> relaying).

### SuccessRate

This weight represents how important pigeon relaying success rate should
be
during validator selection. The metric is collected every time a Pigeon
reports
a message as processed, either successfully or not. 

The success rate goes up by 1% for every successfully relayed message,
and down
by 2% for every failed attempt. The rate is capped between 0% and 100%,
and will
slowly shift back towards a base rate of 50% from either end over the
period of
1000 messages. 
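The bookkeeping described above could look roughly like the sketch
below. The constants and function names are illustrative assumptions,
not the module's actual API; in particular, the drift back toward the
base rate is modelled here as 1/1000th of the remaining distance per
message:

```go
package main

import "fmt"

const (
	baseRate    = 50.0   // rate both ends drift back toward
	driftWindow = 1000.0 // messages over which the drift amortises
)

func clamp(v, lo, hi float64) float64 {
	if v < lo {
		return lo
	}
	if v > hi {
		return hi
	}
	return v
}

// updateSuccessRate applies the rules described above: +1% per
// successfully relayed message, -2% per failed attempt, a slow drift
// toward the base rate, and a hard cap between 0% and 100%.
func updateSuccessRate(rate float64, success bool) float64 {
	if success {
		rate += 1
	} else {
		rate -= 2
	}
	// move 1/1000th of the remaining distance back toward the base rate
	rate += (baseRate - rate) / driftWindow
	return clamp(rate, 0, 100)
}

func main() {
	r := 50.0
	r = updateSuccessRate(r, true)
	fmt.Printf("%.2f\n", r)
}
```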

### ExecutionTime

I have no idea what this one was intended to represent. My guess is
pigeon relaying time (i.e. faster → better). The metric works much the
same way as the success rate, with a moving average built on the last
100 messages delivered, self-purging after 1000 messages.
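
Assuming that reading, a self-purging moving average over the last 100
deliveries could be sketched like this (illustrative names and types,
not the actual implementation):

```go
package main

import "fmt"

// execTimeMetric keeps recent delivery times and averages the most
// recent 100, discarding older samples once the buffer exceeds 1000.
type execTimeMetric struct {
	samples []int64 // delivery times, e.g. in milliseconds
}

func (m *execTimeMetric) record(ms int64) {
	m.samples = append(m.samples, ms)
	if len(m.samples) > 1000 {
		// self-purge: keep only the most recent 100 samples
		m.samples = m.samples[len(m.samples)-100:]
	}
}

func (m *execTimeMetric) average() int64 {
	n := len(m.samples)
	if n == 0 {
		return 0
	}
	if n > 100 {
		n = 100 // moving average over the last 100 only
	}
	recent := m.samples[len(m.samples)-n:]
	var sum int64
	for _, v := range recent {
		sum += v
	}
	return sum / int64(len(recent))
}

func main() {
	var m execTimeMetric
	for _, t := range []int64{100, 200, 300} {
		m.record(t)
	}
	fmt.Println(m.average()) // (100+200+300)/3 = 200
}
```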

### FeatureSet

This weight represents how important pigeon feature sets (MEV, TX type,
etc.) should be during validator selection.

> [!WARNING]  
> This is still a work in progress.


# Testing completed

- [x] test coverage exists or has been added/updated
- [x] tested in a private testnet

# Breaking changes

- [x] I have checked my code for breaking changes
- [x] If there are breaking changes, there is a supporting migration.
byte-bandit added a commit to palomachain/paloma that referenced this issue Feb 8, 2024
# Related Github tickets

- VolumeFi#421

# Background

Some of our metrics were calculated incorrectly. This fix should address
the issue.

# Testing completed

- [x] test coverage exists or has been added/updated
- [ ] tested in a private testnet

# Breaking changes

- [x] I have checked my code for breaking changes
- [x] If there are breaking changes, there is a supporting migration.
verabehr closed this as completed Feb 8, 2024