
[SPARK-12194] [Spark Core] Sink for reporting metrics to OpenTSDB #10187

Closed

Conversation

kapilsingh5050

@matvey14

matvey14 commented Feb 8, 2016

Looks great, Kapil. Ping @jerryshao @russellcardullo

@AmplabJenkins

Can one of the admins verify this patch?

@holdenk
Contributor

holdenk commented Apr 19, 2016

I don't know if there's reviewer bandwidth for this change (I'm not a committer, but I just don't see a lot of activity around new metric sinks in the code base). If you're interested in making this available, one option in the meantime would be creating a package and putting it on spark-packages.org (it's what I've done for some of my testing stuff personally).

/**
 * Created by kapil on 1/12/15.
 */
private[spark] class OpenTsdbSink(val property: Properties, val registry: MetricRegistry,
    securityMgr: SecurityManager) extends Sink with Logging {
It looks like you forgot to import org.apache.spark.SecurityManager and are using the inappropriate java.lang.SecurityManager instead.

MetricsSystem uses the former here (spark-core_2.10-1.4.1, org/apache/spark/metrics/MetricsSystem.scala:187):
.getConstructor(classOf[Properties], classOf[MetricRegistry], classOf[SecurityManager])

Author

Yes, I had fixed it in my local version, but due to inactivity from the admins on this pull request I haven't really worked on it. Let me know if you have privileges to merge these changes; I can update the pull request.

@MLnick
Contributor

MLnick commented May 25, 2016

@kapilsingh5050 could you close this PR please? I think this can live outside of Spark core as a separate repo, and could be listed in Spark packages.

@alex-kolesnikov

@holdenk It would be great to have such a package.
What do you think about converting metric prefixes like app-xxxxx.x.executor to tags? That would help aggregate metrics in tools like Grafana.
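A minimal sketch of the prefix-to-tags idea suggested above, illustrated in Python for brevity (the sink itself is Scala). The metric-name shape, function name, and tag keys are assumptions for illustration, not taken from the PR:

```python
import re

# Assumed Spark metric-name shape: <app-id>.<executor-id>.<metric name>,
# e.g. "app-20160525-0001.2.executor.threadpool.activeTasks".
_METRIC = re.compile(r"^(app-[^.]+)\.([^.]+)\.(.+)$")

def to_tagged(metric):
    """Split a dotted Spark metric name into (bare_name, tags) so an
    OpenTSDB put can carry app/executor as tags instead of one long key,
    letting dashboards aggregate across executors."""
    m = _METRIC.match(metric)
    if m is None:
        return metric, {}  # driver or non-matching metrics pass through
    app_id, executor_id, name = m.groups()
    return name, {"app_id": app_id, "executor_id": executor_id}

name, tags = to_tagged("app-20160525-0001.2.executor.threadpool.activeTasks")
print(name)  # executor.threadpool.activeTasks
print(tags)  # {'app_id': 'app-20160525-0001', 'executor_id': '2'}
```

With tags in place, a query can sum activeTasks over all executors of one application rather than matching a family of dotted metric names.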

srowen added a commit to srowen/spark that referenced this pull request Jun 20, 2016
@asfgit asfgit closed this in 9251423 Jun 20, 2016