
Network duplicate filter for publish messages #2643

Merged

Commits on Mar 6, 2020

  1. Filter duplicate publish messages before deserializing

    When a message is unique, its digest is saved and passed along to network processing, which may drop the message if the block processor is full.

    Cleaning up a long-unchecked block erases its digest from the publish filter.

    The blocks_filter has been removed as redundant. The publish filter holds 256k entries, using about 4 MB of memory.
    guilhermelawless committed Mar 6, 2020
    Commit ab43d93
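The commit describes hashing each publish message before deserialization, storing the digest in a fixed-size filter, and erasing the digest again when a long-unchecked block is cleaned up. The sketch below is a minimal, hypothetical illustration of that idea, not nano-node's actual implementation: names, sizing (1024 slots instead of 256k), and the digest function (`std::hash`) are all assumptions for demonstration.

```cpp
#include <array>
#include <cstddef>
#include <functional>
#include <mutex>
#include <string>
#include <utility>

// Hypothetical duplicate filter sketch: a fixed-capacity table of message
// digests. apply() returns the digest plus whether it was already present;
// clear() erases a digest so the same message can be processed again later
// (e.g. after a long-unchecked block is cleaned up).
class duplicate_filter
{
public:
	using digest_t = std::size_t;

	// Hash the raw message bytes and record the digest.
	// Returns {digest, was_duplicate}.
	std::pair<digest_t, bool> apply (std::string const & bytes)
	{
		digest_t digest = std::hash<std::string>{}(bytes);
		std::lock_guard<std::mutex> guard (mutex);
		auto & slot = table[digest % table.size ()];
		bool duplicate = (slot == digest);
		slot = digest; // overwrite on collision: probabilistic, like a cache
		return { digest, duplicate };
	}

	// Erase a previously saved digest from the filter.
	void clear (digest_t digest)
	{
		std::lock_guard<std::mutex> guard (mutex);
		auto & slot = table[digest % table.size ()];
		if (slot == digest)
		{
			slot = 0;
		}
	}

private:
	std::mutex mutex;
	std::array<digest_t, 1024> table{}; // illustrative size; commit says 256k entries (~4 MB)
};
```

Because the digest is computed on the raw bytes, duplicates are dropped before any deserialization work is done, which is the cost the commit is avoiding.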
  2. Commit 282e008

Commits on Mar 10, 2020

  1. Commit 4573ada