This repository has been archived by the owner on Jun 7, 2023. It is now read-only.

document requirements of model, what do we need, how much of it? #16

Closed
Tracked by #12
goern opened this issue Jul 6, 2021 · 4 comments
Assignees
Labels
kind/feature Categorizes issue or PR as related to a new feature. lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale. priority/critical-urgent Highest priority. Must be actively worked on as someone's top priority right now.

Comments

@goern
Member

goern commented Jul 6, 2021

No description provided.

@goern goern added kind/feature Categorizes issue or PR as related to a new feature. priority/critical-urgent Highest priority. Must be actively worked on as someone's top priority right now. labels Jul 6, 2021
@goern goern changed the title from document requirements of model, what do we need, how much of it? to @saurabhpujar document requirements of model, what do we need, how much of it? Jul 6, 2021
@saurabhpujar
Contributor

saurabhpujar commented Jul 6, 2021

Before talking about the model requirements, let me first describe the different ways we can train the system.

  1. Custom project training: Train on individual projects, and apply on the same project. (one to one)
  2. General project training: Train on a set of one or more projects, apply on a different set of one or more projects.

Custom Project Training:

  • Gives the best result.
  • Captures the idiosyncrasies of individual projects.
  • Requires a lot of training data for each individual project.
  • Does not generalize well to other projects.
  • Requires more time to implement on a new project.

General Project Training:

  • Results not as good as custom training.
  • Captures general attributes of each project, which may be common to other projects.
  • Training data from multiple projects is combined.
  • Generalizes well to other projects.
  • Requires less time to implement on a new project.

Model Requirements

The performance of the ML/DL models depends on two preconditions:

  1. Issue (Bug) Count: the number of samples available for training.
  2. Negative/Positive ratio: The ratio of negative samples (0 labels) to positive samples (1 labels). We also call it False Positive/True Positive Ratio.

Issue Count:

  • Generally, the more samples available the better.
  • We have tried our models on projects with approximately 10k samples and they have given good results.
  • The best results were for Libtiff which had 12,500 samples.

Negative/Positive Ratio:

  • The closer the ratio is to 1/1, the better. A dataset with a 1/1 ratio is called a balanced dataset.
  • For the bug/vulnerability detection problem, there are way more examples of non-buggy code than buggy code (luckily) which is why the dataset is almost always heavily unbalanced.
  • The more unbalanced the data, the worse the results.
  • Our best results are for Libtiff which has a ratio of 20/1.
  • We have obtained good results for datasets with a ratio of up to 54/1.
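As a minimal sketch of the two checks above, the issue count and negative/positive (N/P) ratio of a labeled dataset can be computed directly from the labels. The function names, the 0/1 label encoding, and the use of the ~10k-sample and 54/1 figures as hard thresholds are assumptions for illustration; the comment above states them only as rough rules of thumb.

```python
# Sketch: issue count and negative/positive (N/P) ratio for a dataset
# labeled 0 = non-buggy (negative) and 1 = buggy (positive).
def np_ratio(labels):
    positives = sum(1 for y in labels if y == 1)
    negatives = sum(1 for y in labels if y == 0)
    if positives == 0:
        raise ValueError("no positive samples; N/P ratio undefined")
    return negatives / positives

def looks_trainable(labels, min_count=10_000, max_ratio=54.0):
    """Heuristic check against the rough thresholds quoted above:
    roughly 10k samples and an N/P ratio of at most ~54/1."""
    return len(labels) >= min_count and np_ratio(labels) <= max_ratio

# Example: 20 negatives for every positive, the ratio reported for Libtiff.
labels = [0] * 20 + [1]
print(np_ratio(labels))  # 20.0
```

A balanced dataset would come out at 1.0; the heavily unbalanced FFMpeg case (120/1) would fail the `max_ratio` check under these assumed thresholds.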

Comments:

  • If the issue count is low, the custom project training option is unavailable to us and we will have to rely on general project training. This is the case with grep (2,441 issues), crun (3,513 issues), and fuse-overlayfs (727 issues).

  • We get poor results when the issue count is very high and the negative/positive ratio is also very high. The worst results are for FFMpeg, which has about 500,000 (check number) examples and an N/P ratio of 120/1. In such cases we can restrict the bug types under consideration to improve the results.

  • Another thing to note is that we can know both these numbers only after analyzing a project with a static analyzer and the D2A auto-labeler.
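The first comment above amounts to a simple decision rule: pick the training strategy from the issue count. The sketch below encodes that rule under assumptions; the ~10k threshold is inferred from the numbers quoted in this thread (not stated as an exact cutoff), and the function name is illustrative.

```python
# Sketch: choose between custom and general project training based on
# the issue count, per the rule of thumb in the comments above.
# The 10k threshold is an assumed cutoff inferred from this thread.
def training_strategy(issue_count, threshold=10_000):
    return "custom" if issue_count >= threshold else "general"

# Issue counts quoted above for a few projects:
projects = {"libtiff": 12_500, "grep": 2_441, "crun": 3_513, "fuse-overlayfs": 727}
for name, count in projects.items():
    print(name, training_strategy(count))
```

Under this rule, only Libtiff qualifies for custom project training; grep, crun, and fuse-overlayfs fall back to general project training. As the last comment notes, these counts are only known after running the static analyzer and the D2A auto-labeler on a project.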

@sesheta

sesheta commented Oct 16, 2021

Issues go stale after 90d of inactivity.
Mark the issue as fresh with /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

/lifecycle stale

@sesheta sesheta added the lifecycle/stale Denotes an issue or PR has remained open with no activity and has become stale. label Oct 16, 2021
@saurabhpujar
Contributor

/close

@sesheta

sesheta commented Oct 18, 2021

@saurabhpujar: Closing this issue.

In response to this:

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

@sesheta sesheta closed this as completed Oct 18, 2021