[SPARK-12194] [Spark Core] Sink for reporting metrics to OpenTSDB #10187
Conversation
Looks great, Kapil. Ping @jerryshao @russellcardullo
Can one of the admins verify this patch?
I don't know if there's reviewer bandwidth for this change (I'm not a committer, but I just don't see a lot of activity in new metric sinks in the code base). If you're interested in making this available, one option in the meantime would be creating a package and putting it on spark-packages.org (it's what I've done for some of my testing stuff personally).
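For context, a sink published as an external package would typically be wired up through conf/metrics.properties using Spark's `*.sink.<name>.class` convention. A hypothetical sketch (the class name and the host/port/period keys here are assumptions, not the actual properties this PR defines):

```
# Hypothetical coordinates for an externally packaged OpenTSDB sink
*.sink.opentsdb.class=com.example.metrics.OpenTsdbSink
*.sink.opentsdb.host=localhost
*.sink.opentsdb.port=4242
*.sink.opentsdb.period=10
*.sink.opentsdb.unit=seconds
```

Spark instantiates the configured class reflectively, which is why the constructor signature discussed below matters.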
 * Created by kapil on 1/12/15.
 */
private[spark] class OpenTsdbSink(val property: Properties, val registry: MetricRegistry,
    securityMgr: SecurityManager) extends Sink with Logging {
It looks like you forgot to import org.apache.spark.SecurityManager and are using the inappropriate java.lang.SecurityManager instead.
MetricsSystem expects the former here (spark-core_2.10-1.4.1, org/apache/spark/metrics/MetricsSystem.scala:187):
.getConstructor(classOf[Properties], classOf[MetricRegistry], classOf[SecurityManager])
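To make the suggested fix concrete, here is a minimal sketch of the corrected declaration (parameter names are taken from the snippet above; the Logging import and the lifecycle method bodies are assumptions based on the Spark 1.x Sink trait, not code from this PR):

```scala
import java.util.Properties

import com.codahale.metrics.MetricRegistry

import org.apache.spark.Logging
// The fix: import Spark's SecurityManager so reflection finds the right constructor.
import org.apache.spark.SecurityManager
import org.apache.spark.metrics.sink.Sink

private[spark] class OpenTsdbSink(
    val property: Properties,
    val registry: MetricRegistry,
    securityMgr: SecurityManager)
  extends Sink with Logging {

  // Sink implementations provide lifecycle hooks; bodies elided in this sketch.
  override def start(): Unit = { /* start the OpenTSDB reporter */ }
  override def stop(): Unit = { /* stop the reporter */ }
  override def report(): Unit = { /* flush metrics on demand */ }
}
```

With java.lang.SecurityManager in scope, the reflective `getConstructor` lookup above would fail at runtime, since the parameter types would not match.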
Yes, I had fixed it in my local version, but due to inactivity from the admins on this pull request I haven't really worked on it. Let me know if you have privileges to merge these changes; I can update the pull request.
@kapilsingh5050 could you close this PR please? I think this can live outside of Spark core as a separate repo, and could be listed in Spark packages.
@holdenk It would be great to have such a package.
Closes apache#13114 Closes apache#10187 Closes apache#13432 Closes apache#13550
@jerryshao @russellcardullo please review