Believe this is an issue ... findings change with each scan of same SBOM #2519

Closed · charbl opened this issue Feb 23, 2023 · 24 comments · Fixed by #3357

Labels: defect (Something isn't working), pending release
Milestone: 4.11

Comments

charbl commented Feb 23, 2023

Current Behavior

So, we have a program using our Jenkins. They produce an SBOM with CycloneDX integrated into their build process. Actually, it's 3 SBOMs, which they then merge into one using the CycloneDX CLI. Long story short: they were reporting the results into a parent project in Dependency-Track, and we asked them to report the results into a child project instead. They reran the job pointed at the child project, and the rerun created an SBOM that was submitted to the child project. The child project reported different results (components and vulnerabilities), which raised the question: why? We ran the Jenkins job again (publishing to the child project) and the numbers changed again; they started to align more closely with the parent project numbers (we still have not removed the parent analysis). We started to question the SBOMs generated by the two different builds, so we pulled them and did a compare: identical, aside from the metadata, which had different serial numbers.

So, to really test this out, we created a dummy Jenkins job that just reports the same SBOM to two different test projects in Dependency-Track (both under the parent project). We ran the test for project one, sent the SBOM over, and looked in Dependency-Track. Test project one was now visible, and we compared the reported numbers to the original project.

This is what we expected:
Components: 1749
Vulnerabilities: Crit: 11, High: 33, Med: 28
Risk Score: 364

This is what we got on first submission:
Components: 2498
Vulnerabilities: Crit: 23, High: 48, Med: 37
Risk Score: 725

Now this was kind of shocking, so we poked around and double-checked the settings and the SBOM diff.
We then decided to run our test job again and report to test project two. Here are the results we got:

This is what we got on first submission:
Components: 2312
Vulnerabilities: Crit: 28, High: 43, Med: 35
Risk Score: 745

Again, shocking that these two were different. And again, we double-checked everything and reached the same conclusion: the config looks good.
We then decided to re-run our two jobs and re-report the same SBOM to their Dependency-Track projects, which already had data!

Both runs reported new numbers for the second submission:

Job 1
Components: 1764
Vulnerabilities: Crit: 12, High: 35, Med: 29
Risk Score: 387

Job 2
Components: 1754
Vulnerabilities: Crit: 11, High: 33, Med: 28
Risk Score: 364

The numbers pulled closer to what we expected but are still different ... I can't explain it. Same SBOM ...

Thoughts? We have read all the documentation; are we doing something wrong?

Steps to Reproduce

  1. Create a job in Jenkins and use the Dependency-Track plugin to publish the SBOM (use synchronous publishing). An equivalent API-based upload sketch is shown after this list.
  2. Create another job in Jenkins and use the plugin to publish the SAME SBOM (use synchronous publishing).
  3. After success, log into Dependency-Track and view the findings.
  4. Note the findings; we refreshed to ensure the background analysis was complete. We had two different results.
  5. Re-run the two Jenkins jobs as-is, which should republish the same SBOM (use synchronous publishing).
  6. Check Dependency-Track again and compare with the first results; you might be surprised to find them different.
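For anyone reproducing this outside Jenkins, here is a minimal sketch of the same flow against the REST API, assuming an API key with BOM upload permission and the standard `PUT /api/v1/bom` endpoint (base URL, file name, and project UUID below are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class BomUpload {
    public static void main(String[] args) throws Exception {
        String baseUrl = "https://dtrack.example.com";              // placeholder
        String apiKey = System.getenv("DTRACK_API_KEY");            // key with BOM upload permission
        String projectUuid = "REPLACE-WITH-PROJECT-UUID";           // target project
        String bom = Base64.getEncoder()
                .encodeToString(Files.readAllBytes(Path.of("bom.xml")));

        // PUT /api/v1/bom expects a JSON payload with the project UUID and the base64-encoded BOM.
        String payload = String.format("{\"project\":\"%s\",\"bom\":\"%s\"}", projectUuid, bom);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/v1/bom"))
                .header("Content-Type", "application/json")
                .header("X-Api-Key", apiKey)
                .PUT(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        // The response contains a token; it can be polled via GET /api/v1/bom/token/{token}
        // until "processing" turns false, i.e. the asynchronous analysis has finished.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

Only after the returned token reports processing as finished do the component and vulnerability counts stop moving, which is also why checking too early can show differing numbers.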

Expected Behavior

See above

Dependency-Track Version

4.7.1

Dependency-Track Distribution

Container Image

Database Server

PostgreSQL

Database Server Version

RDS 13.7

Browser

Microsoft Edge

charbl added the "defect" and "in triage" labels on Feb 23, 2023
rkg-mm (Contributor) commented Feb 24, 2023

Can you check if there is any error logged in the log file after uploading (it could be several minutes later, as processing the content is asynchronous)? We saw similar behavior in #2131 when re-uploading SBOMs to the same project. We fixed that in 4.7, but if the SBOM processing fails at some point for your SBOM, that could be a reason.

charbl (Author) commented Feb 27, 2023

The jobs we tested with had synchronous publishing turned on, and all of them recorded success.

Here is the log for one of the jobs

[screenshot: Jenkins job log]

All of them show the same result in Jenkins.

rkg-mm (Contributor) commented Feb 27, 2023

@charbl Those are Jenkins logs; we would need the Dependency-Track log files.

Also: after uploading, the BOM takes a while to process, and during this time the number of components will change several times as well. Is it already constant when you check it?

charbl (Author) commented Feb 27, 2023

As soon as I can get access, I will check the logs.

charbl (Author) commented Feb 27, 2023

I see this when our job runs:

2023-02-27 18:24:13,172 INFO [BomUploadProcessingTask] Processing CycloneDX BOM uploaded to project: 3d127377-ffd6-4974-90dc-c58407cdc178
2023-02-27 18:24:38,541 INFO [BomUploadProcessingTask] Identified 0 new components
2023-02-27 18:24:38,541 INFO [BomUploadProcessingTask] Processing CycloneDX dependency graph for project: 3d127377-ffd6-4974-90dc-c58407cdc178
2023-02-27 18:24:38,674 INFO [BomUploadProcessingTask] Processed 1767 components and 0 services uploaded to project 3d127377-ffd6-4974-90dc-c58407cdc178
2023-02-27 18:24:38,675 INFO [RepositoryMetaAnalyzerTask] Performing component repository metadata analysis against 1767 components
2023-02-27 18:24:38,704 ERROR [LoggableUncaughtExceptionHandler] An unknown error occurred in an asynchronous event or notification thread
org.apache.lucene.store.AlreadyClosedException: Underlying file changed by an external force at 2023-02-25T21:54:15.762260036Z, (lock=NativeFSLock(path=/data/.dependency-track/index/servicecomponent/write.lock,impl=sun.nio.ch.FileLockImpl[0:9223372036854775807 exclusive valid],creationTime=2023-02-19T18:54:17.400019264Z))
at org.apache.lucene.store.NativeFSLockFactory$NativeFSLock.ensureValid(NativeFSLockFactory.java:191)
at org.apache.lucene.store.LockValidatingDirectoryWrapper.syncMetaData(LockValidatingDirectoryWrapper.java:61)
at org.apache.lucene.index.SegmentInfos.prepareCommit(SegmentInfos.java:802)
at org.apache.lucene.index.IndexWriter.startCommit(IndexWriter.java:5084)
at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:3460)
at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3770)
at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3728)
at org.dependencytrack.search.IndexManager.commit(IndexManager.java:249)
at org.dependencytrack.tasks.IndexTask.inform(IndexTask.java:65)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
2023-02-27 18:24:40,646 INFO [InternalAnalysisTask] Starting internal analysis task
2023-02-27 18:24:40,646 INFO [InternalAnalysisTask] Analyzing 1767 component(s)

charbl (Author) commented Mar 1, 2023

Any thoughts?

charbl (Author) commented Mar 10, 2023

Any idea? Is this the issue?

walterdeboer commented Mar 11, 2023

Looks wrong to me. Any idea why "Underlying file changed by an external force"? Do you have two instances running at once? Or is it something in your SBOM? Maybe attach it to this issue?

nscuro (Member) commented Mar 11, 2023

The AlreadyClosedException is unrelated, but a fix will be shipped with 4.8.0. There's a workaround available that I mentioned here: #2379 (comment)

I think the underlying issue is #2131, as pointed out by @rkg-mm. Until now there's only a workaround in place that prevents the SBOM processing from failing entirely, but the root cause is still unaddressed.

charbl (Author) commented Mar 14, 2023

So, I redid our test and was watching the logs. I did see the AlreadyClosedException on each BOM upload, and I captured the logs as each BOM was uploaded and processed. I will attach the document that has the log entries for each BOM as it was processed.
Interestingly, both BOMs show the same findings, but both are wrong when compared to the original SBOM that was loaded. I have embedded screen captures of the tabs (in the document) that show the numbers of the two tests.

This is the original project (after all the number changes):
[screenshot: original project metrics]

So, are we calling this a bug? I have a customer wanting answers as to why the numbers differ for the same SBOM, and I'm trying to answer them.

DT-BASE.txt (attached)
I could not embed the screen captures in the doc, so here are the two new results:

SBOM 1
[screenshot: test project 1 metrics]

SBOM 2
[screenshot: test project 2 metrics]

rkg-mm (Contributor) commented Mar 14, 2023

@charbl Another shot: does your BOM contain duplicates, in the sense of multiple components with identical PURLs?
On the first upload all instances are added; on following uploads some de-duplication happens due to the way existing components are identified, which may result in lower numbers too.

charbl (Author) commented Mar 14, 2023

Would it be helpful if I uploaded a scrubbed copy of the SBOM? I masked the application name and some of the URL paths related to local resolution.

rkg-mm (Contributor) commented Mar 14, 2023

@charbl Yes, as long as the PURLs remain the same. You can remove internal components (maybe double-check that you can still reproduce it with the scrubbed BOM).

charbl (Author) commented Mar 14, 2023

I ran the test on the scrubbed BOM and got the same results.
cleanSBOM.zip (attached)

rkg-mm (Contributor) commented Mar 14, 2023

I think this confirms my assumption:
You have duplicates in the BOM file, which get scrubbed on follow-up uploads.

For example, look at array-unique@0.2.1. You find it in many variations as bom-refs like
jest-cli@23.6.0|array-unique@0.2.1
jest-message-util@23.4.0|array-unique@0.2.1
etc.

The PURL is always: pkg:npm/array-unique@0.2.1?vcs_url=git://github.com/jonschlinkert/array-unique.git

This is due to the way the CycloneDX npm tool creates those BOMs now: it does not treat the same sub-component within different npm components as an identical entry, because in theory this "could" not be the same component (a modified one). We had a lot of discussions about the way this is done by the CycloneDX npm tool, and whether this should be the case, but as of now it is this way. On the first upload, an entry is created for each of the duplicates (you can confirm this by searching the component list for an example component like the one above; you will find multiple entries). On the second upload, Dependency-Track first looks for identical components in the database for the project, to decide whether it should add a new one or can re-use an existing one. It will find the existing ones and not create new ones, and this further leads to a cleanup of the data, where duplicate components get removed. Not ideal this way, but the result is not wrong; it's just de-duplicated.
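A quick way to confirm this kind of duplication before uploading is to count how often each PURL occurs in the BOM. A minimal sketch, assuming a CycloneDX XML BOM on disk (file name is arbitrary); it uses only the JDK's built-in XML parser:

```java
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class DuplicatePurlCheck {
    public static void main(String[] args) throws Exception {
        // Parse the CycloneDX XML BOM; getElementsByTagName also walks nested <components>.
        Document bom = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(Path.of("bom.xml").toFile());
        NodeList purls = bom.getElementsByTagName("purl");

        // Count occurrences of each PURL across the whole document.
        Map<String, Integer> counts = new HashMap<>();
        for (int i = 0; i < purls.getLength(); i++) {
            counts.merge(purls.item(i).getTextContent().trim(), 1, Integer::sum);
        }

        // Every PURL appearing more than once will collapse into a single component
        // on a follow-up upload, which lowers the reported totals.
        counts.forEach((purl, count) -> {
            if (count > 1) {
                System.out.println(count + "x " + purl);
            }
        });
    }
}
```

Run against the attached BOM, this should list array-unique@0.2.1 and similar packages multiple times, matching the drop in component count after the second upload.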

Related to:
#2265
CycloneDX/cyclonedx-node-npm#306

Background discussions can be found in:
CycloneDX/cyclonedx-node-npm#307
https://github.com/CycloneDX/cyclonedx-node-npm/milestone/2

charbl (Author) commented Mar 14, 2023

So the best practice for now might be to load the BOM twice at first to get more accurate findings? I have no problem with this as "this is how it's done for now"; I just want to ensure this is passed on to the applications we support.

rkg-mm (Contributor) commented Mar 14, 2023

@charbl Probably. I don't know a better way for pure npm projects as of now. But I am not 100% confident the duplicates are completely cleaned up after the 2nd upload already; this may need confirmation.
I saw some references to webpack plugins in your BOM. If you use webpack, the better way could be to use the CycloneDX webpack plugin. First, it does not show this behaviour. Second, it would also reduce the number of tracked components, since webpack removes unnecessary components at build time.

charbl (Author) commented Mar 14, 2023

10-4 ... will try the recommendations and re-run to come up with best practices for where we are. Thanks!

nscuro (Member) commented Jul 5, 2023

There is an additional complication in the BOM provided by @charbl:

xmlbuilder2 appears twice in the BOM:

  • Once with PURL pkg:npm/xmlbuilder2@3.0.2
  • Once with PURL pkg:npm/xmlbuilder2@3.0.2?vcs_url=git://github.com/oozcitak/xmlbuilder2.git
<component type="library" bom-ref="xmlbuilder2@3.0.2">
  <author>Ozgur Ozcitak &lt;oozcitak@gmail.com&gt;</author>
  <name>xmlbuilder2</name>
  <version>3.0.2</version>
  <description>An XML builder for node.js</description>
  <hashes>
    <hash alg="SHA-512">8783146b0198db5093761578c66dc31bd760b2aca10e466f549071f3c6dea97f3026cdd58321908009f956b787b9a7baba74d0c61d75e5a30ae489c25663882b</hash>
  </hashes>
  <licenses>
    <license>
      <id>MIT</id>
    </license>
  </licenses>
  <purl>pkg:npm/xmlbuilder2@3.0.2</purl>
  <externalReferences>
    <reference type="vcs">
      <url>git://github.com/oozcitak/xmlbuilder2.git</url>
      <comment>as detected from PackageJson property "repository.url"</comment>
    </reference>
    <reference type="website">
      <url>http://github.com/oozcitak/xmlbuilder2</url>
      <comment>as detected from PackageJson property "homepage"</comment>
    </reference>
    <reference type="issue-tracker">
      <url>http://github.com/oozcitak/xmlbuilder2/issues</url>
      <comment>as detected from PackageJson property "bugs.url"</comment>
    </reference>
    <reference type="distribution">
      <url>https://registry.npmjs.org/xmlbuilder2/-/xmlbuilder2-3.0.2.tgz</url>
      <comment>as detected from npm-ls property "resolved"</comment>
    </reference>
  </externalReferences>
  <properties>
    <property name="cdx:npm:package:path">node_modules/xmlbuilder2</property>
  </properties>
</component>
<component type="library" bom-ref="xmlbuilder2@3.0.2">
  <author>Ozgur Ozcitak &lt;oozcitak@gmail.com&gt;</author>
  <name>xmlbuilder2</name>
  <version>3.0.2</version>
  <description>An XML builder for node.js</description>
  <licenses>
    <license>
      <id>MIT</id>
    </license>
  </licenses>
  <purl>pkg:npm/xmlbuilder2@3.0.2?vcs_url=git://github.com/oozcitak/xmlbuilder2.git</purl>
  <externalReferences>
    <reference type="vcs">
      <url>git://github.com/oozcitak/xmlbuilder2.git</url>
      <comment>as detected from PackageJson property "repository.url"</comment>
    </reference>
    <reference type="website">
      <url>http://github.com/oozcitak/xmlbuilder2</url>
      <comment>as detected from PackageJson property "homepage"</comment>
    </reference>
    <reference type="issue-tracker">
      <url>http://github.com/oozcitak/xmlbuilder2/issues</url>
      <comment>as detected from PackageJson property "bugs.url"</comment>
    </reference>
  </externalReferences>
  <components>
    <component type="library" bom-ref="xmlbuilder2@3.0.2|js-yaml@3.14.0">
      <author>Vladimir Zapparov &lt;dervus.grim@gmail.com&gt;</author>
      <name>js-yaml</name>
      <version>3.14.0</version>
      <description>YAML 1.2 parser and serializer</description>
      <licenses>
        <license>
          <id>MIT</id>
        </license>
      </licenses>
      <purl>pkg:npm/js-yaml@3.14.0</purl>
      <externalReferences>
        <reference type="vcs">
          <url>nodeca/js-yaml</url>
          <comment>as detected from PackageJson property "repository"</comment>
        </reference>
        <reference type="website">
          <url>https://github.com/nodeca/js-yaml</url>
          <comment>as detected from PackageJson property "homepage"</comment>
        </reference>
        <reference type="distribution">
          <url>https://registry.npmjs.org/js-yaml/-/js-yaml-3.14.0.tgz</url>
          <comment>as detected from npm-ls property "resolved"</comment>
        </reference>
      </externalReferences>
      <properties>
        <property name="cdx:npm:package:path">node_modules/xmlbuilder2/node_modules/js-yaml</property>
      </properties>
    </component>
  </components>
  <properties>
    <property name="cdx:npm:package:path">node_modules/xmlbuilder2</property>
  </properties>
</component>

In both cases there's also an externalReference pointing to git://github.com/oozcitak/xmlbuilder2.git, so I'm not sure why one PURL has the additional qualifier and the other doesn't.

Given this (#2131 (comment)) query performed during BOM processing:

project == :project && ((purl != null && purl == :purl) || (purlCoordinates != null && purlCoordinates == :purlCoordinates) || (swidTagId != null && swidTagId == :swidTagId) || (cpe != null && cpe == :cpe) || (group == :group && name == :name && version == :version))

Only one of the two occurrences would ever match after the first BOM upload, because purlCoordinates and name/version are identical, but purl is not (one has qualifiers, the other doesn't).

There's a trade-off here:

If we want accurate matching, the correlation query above needs to be more strict. More along the lines of:

project == :project && ((purl != null && purl == :purl) && (group == null && name == :name && version == :version))

In other words, all identities must match, not only individual parts.

However, this will make "updating" of existing components with new identities impossible. If this component already exists:

{
  "name": "foo",
  "version": "1.2.3"
}

then uploading the following component:

{
  "name": "foo",
  "version": "1.2.3",
  "purl": "pkg:maven/acme/foo@1.2.3"
}

will cause a new component to be created (and the old one to be deleted), instead of merging them.

IMO, stricter matching is the more desirable option. Merging incomplete identities makes the whole process hard to reason about, and the state of the project is not really deterministic.
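As a rough sketch of what building such a strict correlation filter could look like (the class and method names are illustrative only, not the actual Dependency-Track code; it assumes only PURL, CPE, SWID, and coordinates act as identity fields):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative only: mirrors the idea of "all identities must match",
// not the real org.dependencytrack persistence implementation.
final class StrictIdentityQueryBuilder {

    static String buildFilter(String purl, String cpe, String swidTagId,
                              String group, String name, String version,
                              Map<String, Object> params) {
        List<String> conditions = new ArrayList<>();

        // Each identity present on the incoming component must match exactly;
        // identities that are absent must also be absent on the persisted component.
        conditions.add(purl != null ? "purl == :purl" : "purl == null");
        conditions.add(cpe != null ? "cpe == :cpe" : "cpe == null");
        conditions.add(swidTagId != null ? "swidTagId == :swidTagId" : "swidTagId == null");
        conditions.add(group != null ? "group == :group" : "group == null");
        conditions.add("name == :name");
        conditions.add(version != null ? "version == :version" : "version == null");

        if (purl != null) params.put("purl", purl);
        if (cpe != null) params.put("cpe", cpe);
        if (swidTagId != null) params.put("swidTagId", swidTagId);
        if (group != null) params.put("group", group);
        params.put("name", name);
        if (version != null) params.put("version", version);

        // AND-join instead of the lax OR-join: a duplicate PURL with extra qualifiers
        // no longer matches a component whose PURL lacks them.
        return "project == :project && (" + String.join(" && ", conditions) + ")";
    }

    public static void main(String[] args) {
        Map<String, Object> params = new HashMap<>();
        System.out.println(buildFilter(
                "pkg:npm/xmlbuilder2@3.0.2?vcs_url=git://github.com/oozcitak/xmlbuilder2.git",
                null, null, null, "xmlbuilder2", "3.0.2", params));
    }
}
```

Under such a scheme, the two xmlbuilder2 entries above would remain two separate components across uploads, because their purl values differ.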

nscuro removed the "in triage" label on Jul 5, 2023
nscuro (Member) commented Jul 5, 2023

Thinking about this more, I believe there are certain cases where one might want component identity matching to be "lax", e.g. when searching for components in the portfolio, and others where matching should be "strict", e.g. when importing BOMs.

Any thoughts?
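One way to picture that split, as a sketch only (the enum and method are hypothetical, not an existing Dependency-Track API): the same identity fields feed either an OR-joined filter for portfolio search or an AND-joined filter for BOM import.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of "lax" vs. "strict" component identity matching.
enum IdentityMatchMode { LAX, STRICT }

final class IdentityFilter {

    static String build(IdentityMatchMode mode, boolean hasPurl, boolean hasCpe,
                        boolean hasSwid, boolean hasCoordinates) {
        List<String> parts = new ArrayList<>();
        if (hasPurl) parts.add("purl == :purl");
        if (hasCpe) parts.add("cpe == :cpe");
        if (hasSwid) parts.add("swidTagId == :swidTagId");
        if (hasCoordinates) parts.add("(group == :group && name == :name && version == :version)");

        // LAX: any identity match is enough (useful when searching the portfolio).
        // STRICT: every provided identity must match (useful when importing BOMs,
        // so near-duplicates with differing PURL qualifiers are not merged).
        String joiner = (mode == IdentityMatchMode.LAX) ? " || " : " && ";
        return "project == :project && (" + String.join(joiner, parts) + ")";
    }

    public static void main(String[] args) {
        System.out.println(build(IdentityMatchMode.LAX, true, false, false, true));
        System.out.println(build(IdentityMatchMode.STRICT, true, false, false, true));
    }
}
```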

rkg-mm (Contributor) commented Jul 5, 2023

Import & Tree:
If some of the "core" information really differs (e.g. PURL), I think it's fine to track it as 2 instances. Source needs to fix the data here. Deduplication only should trigger if the core information is really identical.
I think changing a core information from an existing component is a very rare case.

Search:
For searching or similar I think the same logic would be fine. There might be rare cases where it would be nice to match them during search, but I don't think it's worth the effort

nscuro added commits to DependencyTrack/hyades-apiserver referencing this issue on Jul 5, Jul 7, and Jul 8, 2023:

To address DependencyTrack/dependency-track#2519 (comment).

Also add regression test for this specific issue.

Signed-off-by: nscuro <nscuro@protonmail.com>
nscuro added a commit to DependencyTrack/hyades-apiserver that referenced this issue Jul 10, 2023:

* Add bloated BOM for ingestion performance testing
* Prevent query compilation cache being bypassed for `matchSingleIdentity` queries (see DependencyTrack/dependency-track#2540). This also cleans the query from containing weird statements like `(cpe != null && cpe == null)` in case a component does not have a CPE.
* WIP: Improve BOM processing performance
* Handle dependency graph
* Improve dependency graph assembly. Instead of using individual bulk UPDATE queries, use setters on persistent components instead; this way we can again make use of batched flushing.
* Completely replace old processing logic. Also decompose the large processing method into multiple smaller ones, and re-implement notifications.
* Fix not all BOM refs being updated with new component identities
* Be smarter about indexing component identities and BOM refs; also add more documentation
* Reduce logging noise
* Mark new components as such via a new transient field. Required for compatibility with #217
* Compatibility with #217
* Cleanup tests
* Reduce code duplication
* Cleanup; process services
* Finishing touches 🪄
* Make flush threshold configurable. The optimal value could depend on how beefy the database server is, and how much memory is available to the API server.
* Clarify `warn` log when rolling back active transactions
* Log number of consumed components and services before and after de-dupe
* Extend BOM processing test with bloated BOM
* Make component identity matching strict. To address DependencyTrack/dependency-track#2519 (comment). Also add regression test for this specific issue.
* Add regression test for DependencyTrack/dependency-track#1905
* Clarify why "reachability on commit" is disabled; add assertion for persistent object state
* Add tests for `equals` and `hashCode` of `ComponentIdentity`
* Address review comments

Signed-off-by: nscuro <nscuro@protonmail.com>
melba-lopez (Contributor) commented:

@nscuro has this been addressed per DependencyTrack/hyades-apiserver#218? And would this be a potential 4.10 fix?

nscuro (Member) commented Aug 2, 2023

@melba-lopez Yes, this should be fixed in the PR you linked (we have a regression test specifically for this issue). We're planning to backport it (and multiple other improvements we made for Hyades) to DT.

mehab pushed a commit to DependencyTrack/hyades-apiserver that referenced this issue Sep 12, 2023 (the same squashed commit as above, additionally signed off by mehab <meha.bhargava@citi.com>).
nscuro added this to the 4.11 milestone on Jan 7, 2024
github-actions (bot) commented:

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

github-actions bot locked the thread as resolved and limited conversation to collaborators on Mar 26, 2024