
feat(basic-tracer): add recording Span implementation #102

Merged · 14 commits · Jul 29, 2019

Conversation

@mayurkale22 (Member) commented Jul 12, 2019

This PR contains a real implementation of the Span interface. Currently this class is part of the core package, but ultimately it should be moved to the basic-tracer package (dependency on #99).

Note: tests are still pending; I will add them once we agree on the initial approach/implementation.

      this._parentId = spanContext.spanId;
      this._traceState = spanContext.traceState;
    } else {
      this._traceId = randomTraceId();
Member:

is this the span's responsibility?

Member Author:

@hekike what's your take on this?

packages/opentelemetry-core/src/trace/Span.ts: 2 resolved review threads (outdated)
@mayurkale22 force-pushed the realSpan branch 2 times, most recently from e0f3a31 to 67e6c24 (July 13, 2019)
@draffensperger (Contributor) left a comment:

Why wouldn't we want the base Span to be in the core package? Wouldn't the automatic tracers need the same Span as the basic one?

  /** Constructs a new Span instance. */
  constructor(
    readonly _parentTracer: types.Tracer,
    readonly _spanName: string,
Contributor:

Could you make this private readonly _spanName: string, to avoid needing the separate this._name = _spanName?

Not sure whether it's better called _name or _spanName

Member Author:

readonly is not possible, as we have an updateName method to update the span name.

Contributor:

Ah, OK. Can you then use private _spanName so that you don't need a separate _name variable?
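For readers unfamiliar with TypeScript parameter properties, here is a minimal sketch of the two variants being discussed; the class bodies are trimmed, and only the field names come from the diff:

    // Variant in the PR: a separate mutable field, assigned from the
    // constructor argument, because updateName reassigns it later.
    class SpanWithField {
      private _name: string;
      constructor(spanName: string) {
        this._name = spanName;
      }
      updateName(name: string): void {
        this._name = name;
      }
    }

    // Suggested variant: `private _spanName` as a parameter property makes
    // the constructor argument itself the field. It cannot be `readonly`,
    // since updateName must reassign it.
    class SpanWithParameterProperty {
      constructor(private _spanName: string) {}
      updateName(name: string): void {
        this._spanName = name;
      }
    }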

      // This is a root span so no remote or local parent.
      this._traceId = randomTraceId();
    }
    this._startTime = _options.startTime || performance.now();
Contributor:

Sweet about the use of performance.now! I'm excited by the thought of making the timestamps monotonic using this, which will lead to higher-accuracy timestamps and make things easier in the browser. We should document it, though, so people know what to expect - and we will need some utility functions down the line to make it easy for exporters to convert it.

And maybe we should confirm this decision in the SIG - was it fully decided that we will use performance.now for timestamps?
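For context, a hedged sketch of the kind of exporter-side conversion utility alluded to above, assuming an environment that exposes performance.timeOrigin (browsers, or Node's perf_hooks); the helper name is hypothetical and not part of this PR:

    // performance.now() returns a monotonic offset in milliseconds relative
    // to performance.timeOrigin, so an exporter can recover an approximate
    // epoch timestamp by adding the two. (Hypothetical helper, not PR code.)
    function epochMillisFromPerfNow(perfNow: number): number {
      return performance.timeOrigin + perfNow;
    }

    const startTime = performance.now();                  // monotonic, high resolution
    const startEpoch = epochMillisFromPerfNow(startTime); // epoch millis for exporters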

    const spanContext: types.SpanContext = {
      traceId: this._traceId,
      spanId: this._spanId,
      traceOptions: types.TraceOptions.SAMPLED,
Contributor:

Why does this specify SAMPLED here?

Member:

Also curious about this

Member Author:

I was a little confused here. Is it based on the sampling decision?

Member:

Oh, hm, I guess this depends on the tracer's startSpan implementation. The basic-tracer implementation creates a noop span if the span isn't sampled, so the real span implementation here is always sampled: https://github.com/open-telemetry/opentelemetry-js/pull/99/files#diff-b4b335d4beecdee2577ffca9f9b398c7R84

Member Author:

@draffensperger Do you agree on this?

Contributor:

Ah, that makes sense. Could you just add a comment explaining that the non-sampled Spans use the NoOp span implementation, so that this will always be sampled?

Alternatively, can we have this be set by the tracer itself, so we don't need to set it here but rather do it in the constructor?
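To make the invariant concrete, here is a minimal sketch of the startSpan behavior described in the linked PR; the types are simplified, and the names SpanLike, RecordingSpan, and NOOP_SPAN are illustrative, not the actual identifiers:

    interface SpanLike {
      readonly sampled: boolean;
    }

    // Shared, do-nothing span returned for unsampled traces.
    const NOOP_SPAN: SpanLike = { sampled: false };

    class RecordingSpan implements SpanLike {
      // Always sampled by construction: the tracer never builds one otherwise,
      // which is why its SpanContext can hard-code TraceOptions.SAMPLED.
      readonly sampled = true;
      constructor(readonly name: string) {}
    }

    function startSpan(
      name: string,
      sampler: { shouldSample(): boolean }
    ): SpanLike {
      return sampler.shouldSample() ? new RecordingSpan(name) : NOOP_SPAN;
    }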

  try {
    // If the value is an object, it has to be JSON.stringify-able
    const serializedValue =
      typeof value === 'object' ? JSON.stringify(value) : value;
Contributor:

Should this stringify behavior be documented?

Member:

I actually think it should simply not happen. The tracer should accept any type and then it's up to the vendors to handle them. Serialization would happen in the default exporter.

Contributor:

Maybe we could make a shared defaultSerializer function that exporters could optionally use as well?

Member:

Sure, especially since I expect most vendors to just use strings, but we support many different types on our end.
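A sketch of the shared defaultSerializer idea floated here; the name comes from the comment above, while the exact behavior (pass primitives through, stringify objects, fall back on errors) is an assumption:

    // Hypothetical helper exporters could opt into. Primitives pass through;
    // objects are JSON.stringify-ed; failures fall back to String().
    function defaultSerializer(value: unknown): string | number | boolean {
      if (
        typeof value === 'string' ||
        typeof value === 'number' ||
        typeof value === 'boolean'
      ) {
        return value;
      }
      try {
        const json = JSON.stringify(value);
        return json === undefined ? String(value) : json;
      } catch {
        return String(value); // e.g. a circular reference
      }
    }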

@mayurkale22 (Member Author):

> Why wouldn't we want the base Span to be in the core package?

I think this is an implementation of the Span API and it should be part of the SDK (basic-tracer). That way vendors can write their own flavor of tracer/SDK based on the standard set of clean APIs defined in the core package. Let's discuss this more in the next SIG meeting.

> Wouldn't the automatic tracers need the same Span as the basic one?

I think the auto-tracer SDK is based on basic-tracer and the async-hooks scope manager; we can wrap the Span and Tracer implementations from the basic-tracer package in the automatic tracer.

@bg451 (Member) left a comment:

looks like a good starting point

    // @todo: record or export the span
  }

  isRecordingEvents(): boolean {
Member:

can the span be recording events if the span/trace isn't sampled? What if the span is sampled but isRecordingEvents was not set in the options?

Member Author:

I think the assumption is that all sampled spans have the isRecordingEvents flag set to true.

Java's implementation: https://github.com/open-telemetry/opentelemetry-java/blob/master/sdk/src/main/java/io/opentelemetry/sdk/trace/SpanBuilderSdk.java#L179
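A minimal sketch of that assumption, with the open question from above folded in; the option and field names are illustrative, not the PR's actual identifiers:

    // Sketch: recording defaults to the sampling decision, but an explicit
    // isRecordingEvents option could enable recording for unsampled spans
    // (recorded locally but never exported).
    class SketchSpan {
      private readonly _isRecordingEvents: boolean;
      constructor(
        sampled: boolean,
        options: { isRecordingEvents?: boolean } = {}
      ) {
        this._isRecordingEvents =
          options.isRecordingEvents !== undefined
            ? options.isRecordingEvents
            : sampled;
      }
      isRecordingEvents(): boolean {
        return this._isRecordingEvents;
      }
    }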


@mayurkale22 force-pushed the realSpan branch 3 times, most recently from bfc54ac to 128061a (July 15, 2019)
packages/opentelemetry-core/src/trace/Span.ts: 3 resolved review threads (outdated)

@mayurkale22 force-pushed the realSpan branch 3 times, most recently from bafec6c to 0661b73 (July 18, 2019)
@mayurkale22 (Member Author):

> Why wouldn't we want the base Span to be in the core package?

Based on the SIG discussion:

  • Core is a basic set of APIs with noop and default implementations.
  • Core is an empty shelf.
  • Other languages do the same (API + SDK).
  • RealSpan should probably live in basic-tracer.

@mayurkale22 changed the title from "Add Real Span" to "feat(basic-tracer): add recording Span implementation" on Jul 25, 2019
@codecov-io commented Jul 25, 2019

Codecov Report

Merging #102 into master will decrease coverage by 0.96%.
The diff coverage is 85.44%.

@@            Coverage Diff             @@
##           master     #102      +/-   ##
==========================================
- Coverage   98.74%   97.78%   -0.97%     
==========================================
  Files          29       31       +2     
  Lines        1915     2210     +295     
  Branches      221      262      +41     
==========================================
+ Hits         1891     2161     +270     
- Misses         24       49      +25
Impacted Files                                          Coverage Δ
packages/opentelemetry-basic-tracer/src/Span.ts         85.44% <85.44%> (ø)
...ages/opentelemetry-basic-tracer/src/BasicTracer.ts   98.54% <0%> (ø)

packages/opentelemetry-basic-tracer/src/Span.ts: 6 resolved review threads (outdated)
- Log a warning when span is already ended.
- Validate spancontext
- Add serialization logic for setAttribute
- Add spancontext-utils
- Create SpanContext in constructor once
- Rename _getSpanContext to _getParentSpanContext
- Rename underscore prefix from params
- no-parameter-properties
  private _name: string;
  private _ended = false;
  private _startTime: number;
  private _endTime = 0;
Member:

Didn't we agree to use duration for Node specifically because it provides higher precision as a number than an end time?

@draffensperger (Contributor) commented Jul 29, 2019:

Do you think that still applies now that we use performance.now for the timestamps (instead of epoch millis)?

It is true that Node processes that have been running for > ~90 days will start to see slightly worse than nanosecond accuracy for performance.now timestamps, but 10-nanosecond accuracy still holds even for processes that have been running for 2 years (try running 2*365*24*3600*1000 + 1e-5 in a console).

My opinion, though, is that given this relatively high accuracy even for long-running jobs, we shouldn't worry about the workaround of treating span duration differently from its start time or from the timestamps of events within a span.

See #19 for some related discussion
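To spell out the arithmetic behind the console expression quoted above (plain float math, not code from the PR):

    // A JS number has a 53-bit mantissa. Two years in milliseconds is
    // 2 * 365 * 24 * 3600 * 1000 = 63072000000 ms (~2^35.9), so the spacing
    // between adjacent representable values is 2^(35 - 52) ms ≈ 7.6 ns.
    // Adding 1e-5 ms (10 ns) therefore still changes the value:
    const twoYearsMs = 2 * 365 * 24 * 3600 * 1000;
    console.log(twoYearsMs + 1e-5 > twoYearsMs); // true: 10 ns still resolves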

Contributor:

Also, given that we would be calculating duration as the difference between performance.now timestamps, it's not totally clear to me that we would get higher accuracy that way. We'd need to use something like process.hrtime, which then requires wrapping it in platform-specific code.

Member Author:

Moved this conversation to a separate issue (#144) to unblock this PR; will revisit based on that discussion.
