Replies: 3 comments 3 replies
-
Probably I should have posted in the spline-agent repository and not on this one. Any pointers would be really appreciated though.
-
This worked like a charm. Thanks so much @wajda! I'm adding the updated snippet here in case anybody finds it useful.
-
// build.sbt: minimal project definition for compiling the custom Spline post-processing filter
scalaVersion := "2.12.14"
name := "dataopsfilter"
organization := "your.package"
version := "1.0"
// You can define other libraries as dependencies in your build like this:
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2"
libraryDependencies += "za.co.absa.spline.agent.spark" %% "agent-core" % "0.6.2"
-
Background [Optional]
I'm working on a POC for spline where we want to add metadata using a Postprocessing filter.
This is probably a silly question stemming from the fact that I haven't used Java in ages and have lost touch with the ecosystem :P
Question
I have the Scala source code for a post-processing filter, CustomFilter.scala. I'm unsure what the steps are to be able to add something like:
I just want to be able to iterate quickly: edit my filter, run the pyspark shell, and see whether it's doing the right thing.
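In case it helps anyone with the same question, here is a rough sketch of the iteration loop I had in mind. Everything in it is an assumption rather than something verified against 0.6.2: the bundle coordinates, the jar path, and in particular the spark.spline.postProcessingFilter.* configuration keys are recalled from the agent documentation, and the alias and class names are placeholders. The snippet is written for a Scala spark-shell session, but the same flags and --conf values apply when launching the pyspark shell.

// Rough iteration loop (all names and paths are assumptions; double-check against the agent docs):
//   1. sbt package                  produces target/scala-2.12/dataopsfilter_2.12-1.0.jar
//   2. start the shell with the Spline agent bundle and the filter jar on the classpath, e.g.
//        --packages za.co.absa.spline.agent.spark:spark-2.4-spline-agent-bundle_2.12:0.6.2
//        --jars target/scala-2.12/dataopsfilter_2.12-1.0.jar
//   3. run a small job and check the captured lineage for the extra metadata
//
// Programmatic equivalent of the --conf flags:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("spline-custom-filter-poc")
  // codeless init: attach the Spline listener to every query execution
  .config("spark.sql.queryExecutionListeners",
    "za.co.absa.spline.harvester.listener.SplineQueryExecutionListener")
  // register the custom filter under an arbitrary alias (key names assumed from the agent docs)
  .config("spark.spline.postProcessingFilter", "customFilter")
  .config("spark.spline.postProcessingFilter.customFilter.className",
    "com.example.dataops.CustomFilter")
  .getOrCreate()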