Status: Closed
Labels: New Integration (Integrating with a new framework or library)

Description
With the recently released integration for Apache Beam and the general push of Sentry for Data Pipelines, is an integration of Sentry with Apache Spark (especially PySpark) planned?
apache/spark#20151 explicitly mentions that one reason for creating that PR was to make a Sentry integration easier. However, I couldn't find any example code for using Sentry to capture Spark errors, and my own attempts to build an integration on top of the feature introduced by that PR have failed so far.
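For context, in the absence of an official integration, the driver-side half of the problem can be handled with a plain Python wrapper around the job's entry point. This is only a sketch of that pattern, not Sentry's API: `report_to_sentry` is a hypothetical stand-in for what would really be `sentry_sdk.capture_exception`, so the example runs without a Sentry DSN.

```python
import functools
import traceback

def report_to_sentry(exc):
    # Hypothetical stand-in: with the SDK installed and initialized this
    # would be sentry_sdk.capture_exception(exc). Here we just record the
    # formatted exception so the wrapper can be exercised locally.
    report_to_sentry.events.append(
        traceback.format_exception_only(type(exc), exc)
    )

report_to_sentry.events = []

def capture_job_errors(job):
    """Decorate a driver-side entry point: forward any uncaught exception
    to Sentry, then re-raise so Spark still marks the job as failed."""
    @functools.wraps(job)
    def wrapper(*args, **kwargs):
        try:
            return job(*args, **kwargs)
        except Exception as exc:
            report_to_sentry(exc)
            raise
    return wrapper

@capture_job_errors
def run_job():
    # In a real pipeline this would hold the SparkContext / DataFrame logic.
    raise RuntimeError("stage failed")
```

This only covers failures that surface on the driver; executor-side exceptions would still need something like the listener hook that apache/spark#20151 introduces.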