
BOM upload fails without feedback due to field max length #1665

Open

ecaisse opened this issue May 27, 2022 · 6 comments
Labels
defect Something isn't working

Comments


ecaisse commented May 27, 2022

A BOM file that contains a component whose "publisher" field exceeds 255 characters fails to import because of the database column's length constraint. However, there is no feedback or other way to know that the BOM upload failed, short of digging through the log file.

Current Behavior:

When uploading a broken BOM file (see bom-broken.xml in bom-broken.zip), no components will be loaded, and there is no way to see that the BOM processing failed outside the log file. See "Additional Details" section for the stacktrace.

I manually created the BOM file to include only the broken component; however, this is how https://github.com/CycloneDX/cyclonedx-dotnet would generate the component for Hangfire.PostgreSql@1.9.6.

Steps to Reproduce:

  • Create a BOM file with a component that has a publisher value of more than 255 characters
    -- Or use the bom-broken.xml in bom-broken.zip
  • Upload the BOM file to Dependency Track manually in a test project
  • Note that no errors are raised, and no components are loaded

To test that the issue is with the publisher field, simply truncate the field in the XML file and reupload it to Dependency Track. The component should appear correctly in the Components tab.
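For anyone who wants to script that truncation test, here is a minimal sketch. It assumes the CycloneDX 1.3 XML namespace used by bom-broken.xml; the 255-character limit comes from the stack trace below, and bom-fixed.xml is just a placeholder output name:

# Workaround sketch (not part of Dependency-Track): truncate overlong
# <publisher> values in a CycloneDX 1.3 XML BOM before re-uploading.
import xml.etree.ElementTree as ET

NS = "http://cyclonedx.org/schema/bom/1.3"  # CycloneDX 1.3 XML namespace
MAX_LEN = 255  # column limit observed in the PUBLISHER error below

ET.register_namespace("", NS)  # keep the default namespace on output
tree = ET.parse("bom-broken.xml")

for publisher in tree.getroot().iter(f"{{{NS}}}publisher"):
    if publisher.text and len(publisher.text) > MAX_LEN:
        publisher.text = publisher.text[:MAX_LEN]

tree.write("bom-fixed.xml", xml_declaration=True, encoding="UTF-8")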

Expected Behavior:

I expect one of two behaviors:

  1. BOM processing should succeed, and limits on fields should be removed unless explicitly mandated by the CycloneDX specification
  2. There should be a way to see in the frontend that BOM processing failed, especially when the BOM was uploaded manually

I think option 1 is better than option 2. The restriction is a problem for automation, as there is no way to predict whether a component will break Dependency Track. Since the specification does not impose restrictions on field lengths, Dependency Track should not enforce arbitrary ones. In the meantime, a pre-upload check like the sketch below can serve as a stopgap.
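A minimal client-side guard: walk a CycloneDX JSON BOM and flag every string longer than 255 characters. The blanket 255-character assumption is based on this issue, not on any documented Dependency-Track contract, and bom.json is a placeholder file name:

# Pre-upload guard sketch: report every string value that might hit a
# 255-character column on import.
import json

MAX_LEN = 255  # assumption based on this issue, not a documented limit

def find_long_strings(node, path="$"):
    if isinstance(node, str) and len(node) > MAX_LEN:
        yield path, len(node)
    elif isinstance(node, dict):
        for key, value in node.items():
            yield from find_long_strings(value, f"{path}.{key}")
    elif isinstance(node, list):
        for i, value in enumerate(node):
            yield from find_long_strings(value, f"{path}[{i}]")

with open("bom.json") as f:  # placeholder file name
    bom = json.load(f)

for path, length in find_long_strings(bom):
    print(f"{path}: {length} characters (limit {MAX_LEN})")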

Environment:

  • Dependency-Track Version: 4.4.2 and 4.5.0
  • Distribution: Docker
  • BOM Format & Version: XML 1.3
  • Database Server: PostgreSQL (but should apply to all)
  • Browser: Chrome 100.0.4896.127

Additional Details:

DT stacktrace

2022-05-27 16:37:25,861 [] ERROR [org.dependencytrack.tasks.BomUploadProcessingTask] Error while processing bom
javax.jdo.JDOFatalUserException: Attempt to store value "Frank Hommers and others (Burhan Irmikci (barhun), Zachary Sims(zsims), kgamecarter, Stafford Williams (staff0rd), briangweber, Viktor Svyatokha (ahydrax), Christopher Dresel (Dresel), Vytautas Kasparavičius (vytautask), Vincent Vrijburg, David Roth (davidroth)." in column "PUBLISHER" that has maximum length of 255. Please correct your data!
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:615)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:720)
        at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
        at alpine.persistence.AbstractAlpineQueryManager.persist(AbstractAlpineQueryManager.java:417)
        at org.dependencytrack.persistence.ComponentQueryManager.createComponent(ComponentQueryManager.java:320)
        at org.dependencytrack.persistence.QueryManager.createComponent(QueryManager.java:379)
        at org.dependencytrack.tasks.BomUploadProcessingTask.processComponent(BomUploadProcessingTask.java:170)
        at org.dependencytrack.tasks.BomUploadProcessingTask.inform(BomUploadProcessingTask.java:124)
        at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:99)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.datanucleus.exceptions.NucleusUserException: Attempt to store value "Frank Hommers and others (Burhan Irmikci (barhun), Zachary Sims(zsims), kgamecarter, Stafford Williams (staff0rd), briangweber, Viktor Svyatokha (ahydrax), Christopher Dresel (Dresel), Vytautas Kasparavičius (vytautask), Vincent Vrijburg, David Roth (davidroth)." in column "PUBLISHER" that has maximum length of 255. Please correct your data!
        at org.datanucleus.store.rdbms.mapping.column.CharColumnMapping.setString(CharColumnMapping.java:253)
        at org.datanucleus.store.rdbms.mapping.java.SingleFieldMapping.setString(SingleFieldMapping.java:183)
        at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeStringField(ParameterSetter.java:158)
        at org.datanucleus.state.StateManagerImpl.providedStringField(StateManagerImpl.java:1853)
        at org.dependencytrack.model.Component.dnProvideField(Component.java)
        at org.dependencytrack.model.Component.dnProvideFields(Component.java)
        at org.datanucleus.state.StateManagerImpl.provideFields(StateManagerImpl.java:2528)
        at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:352)
        at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
        at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
        at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:4569)
        at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:4546)
        at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2026)
        at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1869)
        at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1724)
        at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:219)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:715)
        ... 10 common frames omitted

ringerl commented Feb 1, 2023

Any updates on this one? The PURL field/column is also affected:

2023-02-01 12:49:33,255 ERROR [BomUploadProcessingTask] Error while processing bom
javax.jdo.JDOFatalUserException: Attempt to store value "pkg:npm/%40types/testing-library__jest-dom@5.14.5?download_url=https%3A%2F%2Fartifactory.power.inet%3A443%2Fartifactory%2Fapi%2Fnpm%2Fnpm-viopt%2F%40types%2Ftesting-library__jest-dom%2F-%2Ftesting-library__jest-dom-5.14.5.tgz#types/testing-library__jest-dom" in column ""PURL"" that has maximum length of 255. Please correct your data!

@nscuro added the "defect (Something isn't working)" label and removed the "in triage" label on Feb 1, 2023

nscuro (Member) commented Feb 1, 2023

No progress so far.

If you're using the cyclonedx-node-npm module to generate your BOMs, it supports the --short-PURLs flag for exactly this purpose: https://github.com/CycloneDX/cyclonedx-node-npm#usage
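For reference, a generation call with that flag would look something like npx @cyclonedx/cyclonedx-npm --short-PURLs --output-file bom.json (the flag name is taken from the linked usage docs; double-check the exact syntax against your installed version).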


esnible commented Jun 12, 2023

A similar problem occurs if the metadata.component.name field is too long.

2023-06-12 18:10:25,423 ERROR [GlobalExceptionHandler] Uncaught internal server error
javax.jdo.JDOFatalUserException: Attempt to store value "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, sbom_scan, tracker_20261" in column ""NAME"" that has maximum length of 255. Please correct your data!
	at org.datanucleus.api.jdo.JDOAdapter.getJDOExceptionForNucleusException(JDOAdapter.java:678)
	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:702)
	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:722)
	at alpine.persistence.AbstractAlpineQueryManager.persist(AbstractAlpineQueryManager.java:427)
	at org.dependencytrack.persistence.ProjectQueryManager.createProject(ProjectQueryManager.java:431)
...
Caused by: org.datanucleus.exceptions.NucleusUserException: Attempt to store value "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, sbom_scan, tracker_20261" in column ""NAME"" that has maximum length of 255. Please correct your data!

Here is an SBOM that triggers the problem:

{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "metadata": {
    "tools": [
      {
        "vendor": "MyCompany",
        "name": "MyTool SBOM Generator",
        "version": "0.0.1"
      }
    ],
    "component": {
      "type": "application",
      "name": "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, sbom_scan, tracker_20261",
      "version": "0.0.0.0"
    }
  }
}

The D-T API returns a 500 when I attempt an upload using github.com/DependencyTrack/client-go; a 4xx would be more appropriate. The D-T UI didn't complain at all.
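Until the API returns a proper 4xx, a client can at least refuse to treat an error response as success. A minimal sketch, assuming the standard PUT /api/v1/bom endpoint with a base64-encoded BOM; the server URL, API key, and project UUID are placeholders:

# Client-side sketch: fail loudly on a non-2xx response instead of letting
# a 500 pass silently.
import base64
import requests

with open("bom.json", "rb") as f:  # placeholder file name
    payload = {
        "project": "00000000-0000-0000-0000-000000000000",  # placeholder UUID
        "bom": base64.b64encode(f.read()).decode("ascii"),
    }

resp = requests.put(
    "https://dtrack.example.com/api/v1/bom",  # placeholder server URL
    json=payload,
    headers={"X-Api-Key": "REDACTED"},  # placeholder API key
)
resp.raise_for_status()  # surfaces the 500 that the UI swallows
print("upload accepted, token:", resp.json().get("token"))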

@savek-cc

We just ran into this issue with a Node purl that includes repository names in the purl string. Is there an actual upper bound on the length of a purl? Crashing and importing only part of the SBOM does not seem like the best strategy: because the failing database insert isn't handled gracefully, the processed BOM is silently truncated, and there is no way for a consumer to notice that the import was incomplete.

@ataraxus

Yep, just crashed into this issue as well. PURL is too long...


sfmcgee commented Mar 15, 2024

We have a customer-required RPM installed on our RHEL 8 hosts with a very long name (81 characters). The PURL generated for it in the SBOM is 276 characters, which exceeds the 255-character limit, so ingestion into Dependency Track breaks for all of our systems. We would be happy with either truncating the PURL to 255 characters on ingest or allowing longer fields.

We have also enabled alerting on BOM consumption and processing failures so that we at least have visibility into failed ingests.
