Apply new optimal set of settings after tuning #955

Open
amidgol wants to merge 8 commits into master

Conversation

@amidgol (Contributor) commented Apr 11, 2024

This change is based on the results of the tuning paper. We found a new set of configuration settings to be optimal: it detects a higher number of faults while keeping a competitive degree of line and branch coverage. This setting is referred to as "best faults" in the paper.

@amidgol requested a review from arcuri82 on April 11, 2024 at 17:25

@arcuri82 (Collaborator) left a comment

@amidgol thanks. You should ask me to review the PR once the CI passes; it fails now. The reason is that modifying EMConfig requires running ConfigToMarkdown to update the documentation. Run it, check that the CI passes, and then ask again for my review.
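
For context: the options documentation is generated from EMConfig by ConfigToMarkdown, so any change to EMConfig leaves the generated markdown stale until the generator is re-run. The sketch below is a minimal, hypothetical Kotlin illustration of that general pattern, assuming the generator reflects over annotated config properties; the names `Doc`, `MyConfig`, and `toMarkdown`, the property names, and the default values are all placeholders, not EvoMaster's actual API or tuned settings.

```kotlin
// Hypothetical sketch (requires kotlin-reflect on the classpath); names and
// values are placeholders, not EvoMaster's actual configuration options.
import kotlin.reflect.full.findAnnotation
import kotlin.reflect.full.memberProperties

// Annotation carrying the human-readable description of an option.
@Target(AnnotationTarget.PROPERTY)
annotation class Doc(val description: String)

// Example config class: each option is a property with a default value.
class MyConfig {
    @Doc("Probability of applying a given mutation operator")
    var mutationProbability: Double = 0.5

    @Doc("Maximum number of actions in a generated test")
    var maxTestSize: Int = 10
}

// Build a markdown table from the annotated properties. Because the table is
// derived from the class itself, it goes stale whenever an option or default changes.
fun toMarkdown(config: MyConfig): String {
    val header = listOf("| Option | Default | Description |", "|---|---|---|")
    val rows = MyConfig::class.memberProperties.mapNotNull { p ->
        val doc = p.findAnnotation<Doc>() ?: return@mapNotNull null
        "| ${p.name} | ${p.get(config)} | ${doc.description} |"
    }
    return (header + rows).joinToString("\n")
}

fun main() {
    println(toMarkdown(MyConfig()))
}
```

In a setup like this, running the generator prints the refreshed markdown table, and the regenerated documentation file would need to be committed together with the config change for CI to pass.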

@arcuri82 (Collaborator) commented

@amidgol some E2E tests fail after the change of parameters. This shouldn't really happen (but it might be an issue with the E2E tests themselves).
A few seem related to AdaptiveHypermutation, so you might need to have a chat with, or get help from, @man-zhang to look into it.
