PDP 31 (End to End Request Tags)

Derek Moore edited this page Jul 9, 2021 · 1 revision

Status: Under discussion.

Related issues: Issue 2803

Motivation

For the Pravega development team, finding the root cause of errors that appear in production environments is critical. A mechanism that makes log-based debugging easier is important for two reasons: i) it allows us to increase software quality and robustness in Pravega, and ii) it helps us understand problems in users' production environments.

In practice, errors typically manifest via the client, which provides little information about the root cause of the problem. A motivating challenge is to attach ids or tags to client requests that help us track the activity of failed requests throughout the system, thus limiting the amount of logs to explore. Moreover, automatic activity within the system, such as data retention or auto-scaling, generates requests that would also be interesting to track end-to-end.

Goal

The goal of this PDP is to offer a mechanism to attach global tags to client requests that propagate through other Pravega components (i.e., Controller, Segment Store). Ideally, this will provide us with a clear picture of the causal relationship between operations across distributed processes in Pravega. Therefore, such global tags will enable us to rapidly track the lifecycle of a request end-to-end through simple log inspection, thus facilitating debugging activities.

To illustrate our goal, let's assume that a Pravega client executes a createStream operation. In principle, a global tag for that request should be propagated as follows. While issuing the request, the client creates TagX, logs that tag locally, and propagates the tag via the RPC request. Next, the Controller server receives the tagged RPC request and uses TagX for logging purposes internally, so we can keep track of the internal Controller operations associated with that tag. At some point, the Controller uses TagX to execute requests against the Segment Store to create the initial segments for the new stream. The Segment Store receives the createSegment requests along with the associated TagX via the "wire protocol", so we can keep track of the internal Segment Store operations associated with TagX. Finally, the responses from the Segment Store to the Controller, and from the Controller to the client, are also associated with TagX.

Requirements

Inter- & intra-component traceability: To really understand the lifecycle of a request end-to-end, we not only need to propagate request tags across Pravega components via the network, but also to keep track of the tasks associated with a request within a given component. Note that this may be non-trivial, given the asynchronous execution model implemented by Pravega components.

Non-intrusiveness: While end-to-end request tags may improve our current debugging scheme, propagating tags across components (and within them) may involve significant changes to the current code base, depending on the implementation. For this reason, end-to-end request tags should be implemented with minimal impact on the current code base of Pravega.

Method

At a high level, achieving our goals requires completing the following subtasks:

  1. Propagate request tags internally throughout Controller/Segment Store async tasks.
  2. Attach request tags in RPC calls from clients to the controller (and back).
  3. Use wire commands to propagate request tags between clients and segment store.
  4. Async request tag management in Tier-2.

Propagating tags across async computations

To understand any potential issue during a debug session, we need to propagate a client-side request tag “within” each component (i.e., Controller, Segment Store). Given the asynchronous design of Pravega, this translates into finding a clean approach to associate request tags with the async tasks (i.e., CompletableFuture) executed along a request’s lifecycle.

To address this problem, we propose to create a cache of traced client requests. In our current implementation, such a cache is provided by the RequestTracker class, which contains a Guava cache of <requestDescriptor, requestId> pairs:

  • requestDescriptor: A request descriptor is a string that identifies a client's request within the RequestTracker instance of a Pravega component.
  • requestId: This field contains the client-generated id that has been propagated to identify a client request.

The main motivation for using a cache to trace Pravega requests within a component is that, from any point in the code, we can retrieve a client's requestId without adding it as a parameter to the method's signature. This is in line with the intra-component traceability and non-intrusiveness objectives mentioned before.

Technically, the requestDescriptor, which is the key used to find the client-generated request id, is built from the local information available in the chain of async computations of a request's lifecycle. For instance, in the case of a createStream client request, the Controller may internally store the following in RequestTracker: <createStream-myScope-myStream, 12345>. That is, in the tracing cache of the Controller, we associate the descriptor createStream-myScope-myStream with the client request id 12345. By doing this, most of the async tasks in the Controller that participate in createStream can access the client-generated id, given that they already have myScope and myStream locally available.

Limitation: This approach has a subtle limitation: it is hard to distinguish between concurrent requests with the same descriptor. That is, when handling multiple createStream requests on the same stream, we could store all of them in RequestTracker, but it would not be possible to infer which one to retrieve on a "get". For a wide variety of scenarios, collisions are expected to be uncommon. Further, where collisions do appear, RequestTracker relies on the following assumption: operations that are traced end-to-end are idempotent (i.e., only one request at a time may change the state of the system, even though multiple requests can be issued concurrently). Under this assumption, when facing multiple request ids for the same descriptor, we retrieve the first one that was logged, which represents both the request that actually changes the state of the system and the related side effects, such as the rejection of the remaining concurrent requests. This at least provides a consistent view of the end-to-end activity related to the request that changes the state of the system, which is the most interesting part for debugging purposes.
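To make the scheme concrete, the following is a minimal, self-contained sketch of the descriptor/id cache and the collision policy described above. It is illustrative only: Pravega's actual RequestTracker uses a Guava cache with eviction, whereas this sketch uses a plain ConcurrentHashMap, and buildDescriptor is a hypothetical helper.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// Simplified sketch of a request-tracking cache. Pravega's actual
// RequestTracker uses a Guava cache with expiration; a plain map of
// insertion-ordered id lists keeps this example self-contained.
class RequestTrackerSketch {
    private final Map<String, List<Long>> ongoingRequests = new ConcurrentHashMap<>();

    // Descriptors are built from locally available information,
    // e.g. "createStream-myScope-myStream".
    static String buildDescriptor(String... tags) {
        return String.join("-", tags);
    }

    void trackRequest(String descriptor, long requestId) {
        ongoingRequests.computeIfAbsent(descriptor, d -> new CopyOnWriteArrayList<>())
                       .add(requestId);
    }

    // Collision policy: if several concurrent requests share a descriptor,
    // return the first id logged, which represents the request that
    // actually changed the state of the system.
    long getRequestIdFor(String... tags) {
        List<Long> ids = ongoingRequests.get(buildDescriptor(tags));
        if (ids == null || ids.isEmpty()) {
            return -1L; // untracked request
        }
        return ids.get(0);
    }

    void untrackRequest(String descriptor, long requestId) {
        List<Long> ids = ongoingRequests.get(descriptor);
        if (ids != null) {
            ids.remove(requestId); // boxes to Long, so this removes by value
        }
    }
}
```

Because the map retains each id until it is explicitly untracked, the tag survives across the whole chain of async tasks without being threaded through method signatures.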

Request tags in RPC calls

Our objective is to propagate client request tags or ids across Pravega components using the existing logging infrastructure (i.e., not adding additional dependencies for tracing). That is, we would need to:

  1. Generate global tags at the client side that represent the activity associated with a request. In Pravega, clients issue RPC requests to the Controller and locally create an id to trace each request. This can be observed in the following code snippet:
long traceId = LoggerHelpers.traceEnter(log, "createStream", streamConfig);
// execute RPC request
LoggerHelpers.traceLeave(log, "createStream", traceId);

Given that this pattern is adopted in all client RPC requests, we can reuse such client-generated request ids or tags (i.e., traceId) and propagate them throughout the system. Note that such client-generated ids are not random, but based on the current time from the client's perspective.

  2. Propagate request ids to the Controller by attaching them in RPC headers. Once a client creates a request identifier, we need to attach it to the RPC request. Naturally, our objective is to keep the RPC protocol intact, so we make use of the request headers to transparently piggyback this information. The following is an example of our proposal:
final CompletableFuture<CreateStreamStatus> result = this.retryConfig.runAsync(() -> {
            RPCAsyncCallback<CreateStreamStatus> callback = new RPCAsyncCallback<>(traceId, "createStream");
            addTagsToRequest(client, traceId, "createStream", streamConfig.getScope(), streamConfig.getStreamName())
                    .createStream(ModelHelper.decode(streamConfig), callback);
            return callback.getFuture();
        }, this.executor);

In the code snippet above, addTagsToRequest adds "options" or "tags" to the client stub that contain the requestDescriptor and the requestId. Once these options are set, we also provide a class called FishTaggingClientInterceptor that implements ClientInterceptor. Its purpose is to attach the client stub options (if any) as metadata headers of the RPC request. We set up the client interceptor in ControllerImpl as follows:

channelBuilder = channelBuilder.intercept(TracingHelpers.getClientInterceptor());

  3. Intercept requests to extract the tags from RPC requests at the server side. On the other end, the Controller requires a clean mechanism to extract client-generated request ids from RPC requests. For this reason, we provide a class named FishTaggingServerInterceptor that implements ServerInterceptor. This class is intended to get the client-generated tags available in the metadata headers of the request (if any) and add them to the RequestTracker instance in the Controller.

Finally, we intercept the RPC server (GRPCServer) as follows:

ServerBuilder<?> builder = ServerBuilder
                .forPort(serverConfig.getPort())
                .addService(ServerInterceptors.intercept(new ControllerServiceImpl(controllerService, authHelper, serverConfig.isReplyWithStackTraceOnError()), TracingHelpers.getServerInterceptor()));

  4. Attach client-generated request ids to controller events. In Pravega, the Controller has been designed to be stateless and to handle failures. To this end, many client requests against the Controller are not immediately executed, but result in a ControllerEvent representing the action to be executed. Such ControllerEvents are persisted in an internal Pravega stream and consumed by the same instance that created them or by any other instance. This means that if a Controller instance crashes, another one may take over, consume the ControllerEvent, and finalize the interrupted operation. However, we need to propagate the requestId from the original client request to all the classes extending ControllerEvent that are important to trace. At the moment, we have added the requestId field to AbortEvent, CommitEvent, DeleteStreamEvent, SealStreamEvent, TruncateStreamEvent, UpdateStreamEvent and ScaleOpEvent (in the latter case, the field scaleTime is an equivalent for requestId).
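Steps 2 and 3 above can be summarized in a framework-agnostic sketch. It is illustrative only: the real code uses the grpc-java ClientInterceptor/ServerInterceptor and Metadata APIs, while here a plain Map stands in for the RPC metadata headers, and the class and method names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of tag piggybacking on RPC headers. The header names
// match those visible in the traced logs ("requestdescriptor", "requestid");
// everything else is a hypothetical stand-in for the gRPC interceptors.
class RpcTagPropagationSketch {
    static final String REQUEST_DESCRIPTOR_HEADER = "requestdescriptor";
    static final String REQUEST_ID_HEADER = "requestid";

    // Client side: piggyback the tags as metadata headers, leaving the
    // RPC payload itself untouched.
    static void attachTags(Map<String, String> headers, String descriptor, long requestId) {
        headers.put(REQUEST_DESCRIPTOR_HEADER, descriptor);
        headers.put(REQUEST_ID_HEADER, Long.toString(requestId));
    }

    // Server side: extract the client-generated id (or -1 if the request
    // is untagged) so it can be added to the Controller's RequestTracker.
    static long extractRequestId(Map<String, String> headers) {
        String id = headers.get(REQUEST_ID_HEADER);
        return id == null ? -1L : Long.parseLong(id);
    }

    static String extractDescriptor(Map<String, String> headers) {
        return headers.get(REQUEST_DESCRIPTOR_HEADER);
    }
}
```

In the real implementation, attachTags corresponds to setting stub options plus FishTaggingClientInterceptor copying them into the headers, and extractRequestId to FishTaggingServerInterceptor reading them back out.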

Use WireCommands to propagate request tags

A client RPC request that reaches the Controller may also have effects on the Segment Store. That is, returning to our initial example, the createStream operation makes the Controller create a set of initial segments in the Segment Store. Therefore, to understand the end-to-end lifecycle of a request, we need to propagate the request tags via the WireCommands used by the Segment Store protocol.

Most of the WireCommands already have a requestId attribute. A plausible option is to re-interpret the way this attribute is used today and associate it with an end-to-end tag instead. By doing this, we avoid changes to WireCommands to propagate request tags, provided that tags are of type long.

Concretely, SegmentHelper is an internal Controller class that contains the logic to perform operations against the Segment Store. This class previously generated its own identifiers to track requests. Instead, it now accepts request ids passed by parameter, which are related to the client requestId. That is, executing createStream involves one or more createSegment requests from the Controller to the Segment Store:

public CompletableFuture<Boolean> createSegment(final String scope,
                                                final String stream,
                                                final long segmentId,
                                                final ScalingPolicy policy,
                                                final HostControllerStore hostControllerStore,
                                                final ConnectionFactory clientCF,
                                                String controllerToken,
                                                final long requestId) {
    // Some code
    WireCommands.CreateSegment request = new WireCommands.CreateSegment(requestId, qualifiedStreamSegmentName,
            extracted.getLeft(), extracted.getRight(), controllerToken);
    // ...
}

Now, all these requests are associated with the client requestId, so we can track end-to-end all the activity involved in the createStream request.

Async request tag management in Tier-2

The last stage of most operations traced end-to-end is Tier-2 in the Segment Store. An important point to take into account is that Tier-2 activity is asynchronous. In practice, this means that a client executing an operation against Pravega, such as createStream, receives confirmation before the actual segments for that stream are created in Tier-2. Therefore, the Segment Store must keep the request tag for a client even after the request is served.

Concretely, the place in the Segment Store where request tags are managed for Tier-2 is the RollingStorage class. The reason for this decision is that RollingStorage provides a layer on top of Tier-2 implementations, yet still gives us the opportunity to keep track of Tier-2 activity in depth. That is, in each of the main methods related to the lifecycle of segments (create, seal, concat, delete, truncate), we get the request tags associated with the client request, add logs to track Tier-2 activity, and then remove the request tags from RequestTracker.

As an implementation note, it is important to take into account that a single segment in Tier-2 is not represented by a single file, but by a few of them (header, state, attributes). For this reason, when a new operation, such as create or delete, is issued against a segment, we also create new request tags for the operations performed on all the auxiliary files belonging to that segment, so we can track the Tier-2 activity related to them as well.
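The fan-out of one segment-level tag into per-file tags can be sketched as follows; the helper is hypothetical, with the $state/$attributes suffixes taken from the naming scheme visible in the logs.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative helper (not Pravega's actual code): a segment in Tier-2 is
// backed by several files, so one client-level createSegment tag fans out
// into one request descriptor per backing file. Each descriptor is tracked
// and later untracked independently as its Tier-2 write completes.
class Tier2TagSketch {
    static List<String> descriptorsFor(String operation, String segmentName) {
        return Arrays.asList(
                operation + "-" + segmentName,
                operation + "-" + segmentName + "$state",
                operation + "-" + segmentName + "$attributes");
    }
}
```

This is why, in the createStream logs, a single requestId appears with three "Tracking request" / "Untracking request" pairs per segment.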

End-to-end traced operations and processes

In the following, we provide an overview of the operations currently traced end-to-end, as well as the points at which we access and log the client-generated request id:

Client requests

createScope

Client: ControllerImpl:createScope

Controller: ControllerServiceImpl:createScope

Segment Store: n/a

deleteScope

Client: ControllerImpl:deleteScope

Controller: ControllerServiceImpl:deleteScope

Segment Store: n/a

createStream

Client: ControllerImpl:createStream

Controller: ControllerServiceImpl:createStream StreamMetadataTasks:createStream SegmentHelper:createSegment

Segment Store: PravegaRequestProcessor:createSegment StreamSegmentContainer:createStreamSegment RollingStorage:create

updateStream

Client: ControllerImpl:updateStream

Controller: ControllerServiceImpl:updateStream StreamMetadataTasks:updateStream UpdateStreamTask:execute SegmentHelper:updatePolicy

Segment Store: PravegaRequestProcessor:updateSegmentPolicy

truncateStream

Client: ControllerImpl:truncateStream

Controller: ControllerServiceImpl:truncateStream StreamMetadataTasks:truncateStream TruncateStreamTask:execute SegmentHelper:truncateSegment SegmentHelper:deleteSegment

Segment Store: PravegaRequestProcessor:truncateSegment PravegaRequestProcessor:deleteSegment StreamSegmentContainer:truncateStreamSegment StreamSegmentContainer:deleteStreamSegment RollingStorage:truncate RollingStorage:delete

scaleStream

Client: ControllerImpl:scaleStream ControllerImpl:startScale

Controller: ControllerServiceImpl:scale StreamMetadataTasks:manualScale ScaleOperationTask:execute SegmentHelper:createSegment SegmentHelper:sealSegment

Segment Store: PravegaRequestProcessor:createSegment PravegaRequestProcessor:sealSegment StreamSegmentContainer:createStreamSegment StreamSegmentContainer:sealStreamSegment RollingStorage:create RollingStorage:seal

sealStream

Client: ControllerImpl:sealStream

Controller: ControllerServiceImpl:sealStream StreamMetadataTasks:sealStream SealStreamTask:execute SegmentHelper:sealSegment

Segment Store: PravegaRequestProcessor:sealSegment StreamSegmentContainer:sealStreamSegment RollingStorage:seal

deleteStream

Client: ControllerImpl:deleteStream

Controller: ControllerServiceImpl:deleteStream StreamMetadataTasks:deleteStream DeleteStreamTask:execute SegmentHelper:deleteSegment

Segment Store: PravegaRequestProcessor:deleteSegment StreamSegmentContainer:deleteStreamSegment RollingStorage:delete

commitTransaction

Client: ControllerImpl:commitTransaction

Controller: ControllerServiceImpl:commitTransaction StreamTransactionMetadataTasks:commitTxn CommitRequestHandler:execute SegmentHelper:createSegment SegmentHelper:sealSegment SegmentHelper:commitTransaction

Segment Store: PravegaRequestProcessor:mergeSegments PravegaRequestProcessor:createSegment PravegaRequestProcessor:sealSegment StreamSegmentContainer:mergeStreamSegment StreamSegmentContainer:createStreamSegment StreamSegmentContainer:sealStreamSegment RollingStorage:create RollingStorage:seal RollingStorage:concat

abortTransaction

Client: ControllerImpl:abortTransaction

Controller: ControllerServiceImpl:abortTransaction StreamTransactionMetadataTasks:abortTxn AbortRequestHandler:processEvent SegmentHelper:abortTransaction

Segment Store: PravegaRequestProcessor:deleteSegment StreamSegmentContainer:deleteStreamSegment RollingStorage:delete

Automated processes

autoScaling

Pravega Segment Stores can notify the Controller to orchestrate the split/merge of a particular segment, depending on whether that segment is subject to a higher/lower load than the value specified in the stream's scaling policy (i.e., auto-scaling). From a debugging perspective, we are also interested in identifying the lifecycle of such requests across components via a request identifier.

To this end, we use the local timestamp of Segment Stores as the request identifier for scaling events (as we already do with client timestamps in manual scaling requests). Thus, the AutoScaleEvent contains a timestamp field that the AutoScaleTask interprets as requestId and keeps when building the ScaleOpEvent that triggers the scale request. ScaleOperationTask also interprets timestamp as the id for the scale request, and therefore this is the value passed to the interactions with the Segment Store (SegmentHelper). In this case, no interaction with RequestTracker is needed, as the request id is embedded within the events sent from the Segment Store to the Controller (AutoScaleEvent), as well as from the Controller to itself (ScaleOpEvent).
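The timestamp-as-requestId convention can be sketched with a minimal, hypothetical stand-in for AutoScaleEvent, keeping only the fields relevant to tracing:

```java
// Hypothetical stand-in for AutoScaleEvent (not Pravega's actual class):
// the event's timestamp doubles as the end-to-end request id, so the id
// travels inside the persisted event and no RequestTracker entry is needed.
class AutoScaleEventSketch {
    final String scope;
    final String stream;
    final long segmentId;
    final long timestamp;

    AutoScaleEventSketch(String scope, String stream, long segmentId, long timestamp) {
        this.scope = scope;
        this.stream = stream;
        this.segmentId = segmentId;
        this.timestamp = timestamp;
    }

    // Downstream tasks (AutoScaleTask, ScaleOperationTask) interpret the
    // event timestamp as the request id for end-to-end tracing.
    long requestId() {
        return timestamp;
    }
}
```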

retention

Pravega provides users with the ability to configure a Stream with a certain data retention policy (i.e., time-based or size-based). Pravega periodically checks (StreamCutBucketService) whether the oldest data of a stream exceeds the defined retention limit. If so, StreamCutBucketService triggers a truncation operation on the stream at the appropriate offset. We are also interested in tracing this type of automatic retention activity.

To trace these operations, we first focus on StreamCutBucketService as the initiator of the retention activity. Concretely, in the performRetention method we create the requestId for the truncation request, which is in turn the timestamp used in this kind of request to avoid changing method signatures (similarly to the autoScaling case). The retention request reaches StreamMetadataTasks, which creates a TruncateStreamEvent providing the timestamp of the request as requestId. From this point onward, the requestId is persisted in the event and used by TruncateStreamTask to interact with the Segment Store (SegmentHelper). Note that we do not use the RequestTracker class to trace these operations.

Interpretation of logs

Next, we want to give a sense of the practical results of tracing operations end-to-end. Concretely, we provide the interpretation of a simple log inspection (e.g., grep across multiple log files) for various operations.

createStream

Interpretation of logs after executing grep on request id 3173897036445:

First, the client creates and logs the request id and attaches it to the RPC request:

**client**:2018-10-10 07:44:30,484 1857 [Time-limited test] TRACE i.p.c.stream.impl.ControllerImpl - ENTER createStream@3173897036445 [StreamConfiguration(scope=testReadWriteAndAutoScaleScope972459555, streamName=testReadWriteAndAutoScaleStream, scalingPolicy=ScalingPolicy(scaleType=BY_RATE_IN_EVENTS_PER_SEC, targetRate=1, scaleFactor=2, minNumSegments=2), retentionPolicy=null)].
**client**:2018-10-10 07:44:30,492 1865 [StreamManager-Controller-1] INFO  i.p.common.tracing.TracingHelpers - [requestId=3173897036445] Tagging RPC request createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream.
**client**:2018-10-10 07:44:30,493 1866 [grpc-default-worker-ELG-1-2] DEBUG io.grpc.netty.NettyClientHandler - [id: 0x3b56b1e4, L:/172.17.0.2:33974 - R:172.16.18.21/172.16.18.21:9092] OUTBOUND HEADERS: streamId=5 headers=GrpcHttp2OutboundHeaders[:authority: 172.16.18.24:9092,172.16.18.25:9092,172.16.18.21:9092, :path: /io.pravega.controller.stream.api.grpc.v1.ControllerService/createStream, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty, requestdescriptor: createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream, requestid: 3173897036445, grpc-accept-encoding: gzip, grpc-trace-bin: ] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false

The Controller receives the request and adds the client request id to the RequestTracker:

**controller**:2018-10-10 07:44:30,496 233025 [grpc-default-worker-ELG-4-7] DEBUG io.grpc.netty.NettyServerHandler - [id: 0x151dfbe5, L:/172.16.18.21:9092 - R:/172.16.18.25:33974] INBOUND HEADERS: streamId=5 headers=GrpcHttp2RequestHeaders[:path: /io.pravega.controller.stream.api.grpc.v1.ControllerService/createStream, :authority: 172.16.18.24:9092,172.16.18.25:9092,172.16.18.21:9092, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, user-agent: grpc-java-netty, requestdescriptor: createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream, requestid: 3173897036445, grpc-accept-encoding: gzip, grpc-trace-bin: ] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
**controller**:2018-10-10 07:44:30,497 233026 [grpc-default-executor-2] INFO  i.p.common.tracing.RequestTracker - Tracking request createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream with id 3173897036445. Current ongoing requests: 7.
**controller**:2018-10-10 07:44:30,497 233026 [grpc-default-executor-2] INFO  i.p.common.tracing.TracingHelpers - [requestId=3173897036445] Received tag from RPC request createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream.
**controller**:2018-10-10 07:44:30,501 233030 [grpc-default-executor-2] INFO  i.p.c.s.r.g.v.ControllerServiceImpl - [requestId=3173897036445] createStream called for stream testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream.
**controller**:2018-10-10 07:44:30,598 233127 [controllerpool-12] INFO  i.p.c.t.Stream.StreamMetadataTasks - [requestId=3173897036445] testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream created in metadata store

The Controller requests the Segment Store to create segment testReadWriteAndAutoScaleStream/1.#epoch.0:

**controller**:2018-10-10 07:44:30,863 233392 [epollEventLoopGroup-2-7] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentCreated(type=SEGMENT_CREATED, requestId=3173897036445, segment=testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0)
**controller**:2018-10-10 07:44:30,864 233393 [epollEventLoopGroup-2-7] INFO  i.p.controller.server.SegmentHelper - [requestId=3173897036445] CreateSegment testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0 SegmentCreated
**controller**:2018-10-10 07:44:30,865 233394 [epollEventLoopGroup-2-7] WARN  i.p.controller.server.SegmentHelper - [requestId=3173897036445] CreateSegment testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0 Connection dropped

One of the Segment Store servers receives and processes the request (also in Tier-2):

**segmentstore-1**:2018-10-10 07:44:30,615 157754 [epollEventLoopGroup-3-6] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=3173897036445, segment=testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0, scaleType=2, targetRate=1, delegationToken=)
**segmentstore-1**:2018-10-10 07:44:30,615 157754 [epollEventLoopGroup-3-6] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=3173897036445, descriptor=createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0. Create request tag at this point.
**segmentstore-1**:2018-10-10 07:44:30,616 157755 [epollEventLoopGroup-3-6] INFO  i.p.common.tracing.RequestTracker - Tracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0 with id 3173897036445. Current ongoing requests: 1.
**segmentstore-1**:2018-10-10 07:44:30,616 157755 [epollEventLoopGroup-3-6] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=3173897036445] Creating stream segment WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=3173897036445, segment=testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0, scaleType=2, targetRate=1, delegationToken=).
**segmentstore-1**:2018-10-10 07:44:30,616 157755 [epollEventLoopGroup-3-6] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=3173897036445] SegmentContainer[3]: createStreamSegment [testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0]
**segmentstore-1**:2018-10-10 07:44:30,616 157755 [storage-io-13] INFO  i.p.common.tracing.RequestTracker - Tracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$state with id 3173897036445. Current ongoing requests: 2.
**segmentstore-1**:2018-10-10 07:44:30,617 157756 [storage-io-13] INFO  i.p.common.tracing.RequestTracker - Tracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$attributes with id 3173897036445. Current ongoing requests: 3.
**segmentstore-1**:2018-10-10 07:44:30,690 157829 [storage-io-13] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=3173897036445] Header for 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0' fully serialized to 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$header'.
**segmentstore-1**:2018-10-10 07:44:30,691 157830 [storage-io-13] INFO  i.p.common.tracing.RequestTracker - Untracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0 with id [3173897036445]. Current ongoing requests: 2.
**segmentstore-1**:2018-10-10 07:44:30,754 157893 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=3173897036445] Header for 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$state' fully serialized to 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$state$header'.
**segmentstore-1**:2018-10-10 07:44:30,755 157894 [storage-io-18] INFO  i.p.common.tracing.RequestTracker - Untracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$state with id [3173897036445]. Current ongoing requests: 1.
**segmentstore-1**:2018-10-10 07:44:32,281 159420 [storage-io-9] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=3173897036445] Header for 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$attributes' fully serialized to 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$attributes$header'.
**segmentstore-1**:2018-10-10 07:44:32,281 159420 [storage-io-9] INFO  i.p.common.tracing.RequestTracker - Untracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/1.#epoch.0$attributes with id [3173897036445]. Current ongoing requests: 0.

The Controller requests the Segment Store to create segment testReadWriteAndAutoScaleStream/0.#epoch.0:

**controller**:2018-10-10 07:44:31,781 234310 [epollEventLoopGroup-2-8] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.24 processing reply: WireCommands.SegmentCreated(type=SEGMENT_CREATED, requestId=3173897036445, segment=testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0)
**controller**:2018-10-10 07:44:31,781 234310 [epollEventLoopGroup-2-8] INFO  i.p.controller.server.SegmentHelper - [requestId=3173897036445] CreateSegment testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0 SegmentCreated
**controller**:2018-10-10 07:44:31,787 234316 [epollEventLoopGroup-2-8] WARN  i.p.controller.server.SegmentHelper - [requestId=3173897036445] CreateSegment testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0 Connection dropped

Another Segment Store server receives and processes the request (also in Tier-2):

**segmentstore-2**:2018-10-10 07:44:30,621 95792 [epollEventLoopGroup-3-1] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=3173897036445, segment=testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0, scaleType=2, targetRate=1, delegationToken=)
**segmentstore-2**:2018-10-10 07:44:30,627 95798 [epollEventLoopGroup-3-1] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=3173897036445, descriptor=createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0. Create request tag at this point.
**segmentstore-2**:2018-10-10 07:44:30,634 95805 [epollEventLoopGroup-3-1] INFO  i.p.common.tracing.RequestTracker - Tracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0 with id 3173897036445. Current ongoing requests: 1.
**segmentstore-2**:2018-10-10 07:44:30,634 95805 [epollEventLoopGroup-3-1] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=3173897036445] Creating stream segment WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=3173897036445, segment=testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0, scaleType=2, targetRate=1, delegationToken=).
**segmentstore-2**:2018-10-10 07:44:30,636 95807 [epollEventLoopGroup-3-1] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=3173897036445] SegmentContainer[0]: createStreamSegment [testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0]
**segmentstore-2**:2018-10-10 07:44:30,639 95810 [storage-io-6] INFO  i.p.common.tracing.RequestTracker - Tracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$state with id 3173897036445. Current ongoing requests: 2.
**segmentstore-2**:2018-10-10 07:44:30,640 95811 [storage-io-6] INFO  i.p.common.tracing.RequestTracker - Tracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$attributes with id 3173897036445. Current ongoing requests: 3.
**segmentstore-2**:2018-10-10 07:44:30,752 95923 [storage-io-6] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=3173897036445] Header for 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0' fully serialized to 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$header'.
**segmentstore-2**:2018-10-10 07:44:30,755 95926 [storage-io-6] INFO  i.p.common.tracing.RequestTracker - Untracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0 with id [3173897036445]. Current ongoing requests: 2.
**segmentstore-2**:2018-10-10 07:44:31,245 96416 [storage-io-7] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=3173897036445] Header for 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$state' fully serialized to 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$state$header'.
**segmentstore-2**:2018-10-10 07:44:31,245 96416 [storage-io-7] INFO  i.p.common.tracing.RequestTracker - Untracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$state with id [3173897036445]. Current ongoing requests: 1.
**segmentstore-2**:2018-10-10 07:44:32,379 97550 [storage-io-7] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=3173897036445] Header for 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$attributes' fully serialized to 'testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$attributes$header'.
**segmentstore-2**:2018-10-10 07:44:32,379 97550 [storage-io-7] INFO  i.p.common.tracing.RequestTracker - Untracking request createSegment-testReadWriteAndAutoScaleScope972459555/testReadWriteAndAutoScaleStream/0.#epoch.0$attributes with id [3173897036445]. Current ongoing requests: 1.
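The Tracking/Untracking lines above come from `RequestTracker`, which maps request descriptors to request ids so that later internal operations can recover the id by descriptor alone. A minimal sketch of that bookkeeping follows; the method names and data structure are illustrative and are not Pravega's actual `io.pravega.common.tracing.RequestTracker` API:

```java
import java.util.Map;
import java.util.Deque;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedDeque;

// Illustrative sketch of descriptor-to-id bookkeeping, as suggested by the
// "Tracking/Untracking request ... Current ongoing requests: N" log lines.
class RequestTracker {
    // A descriptor (e.g., "createSegment-scope/stream/0.#epoch.0") may be
    // tracked more than once concurrently, hence a queue of ids per descriptor.
    private final Map<String, Deque<Long>> ongoing = new ConcurrentHashMap<>();

    void trackRequest(String descriptor, long requestId) {
        ongoing.computeIfAbsent(descriptor, d -> new ConcurrentLinkedDeque<>()).add(requestId);
        System.out.printf("Tracking request %s with id %d. Current ongoing requests: %d.%n",
                descriptor, requestId, ongoingCount());
    }

    // Returns the id associated with a descriptor, or -1 if none is tracked.
    long getRequestIdFor(String descriptor) {
        Deque<Long> ids = ongoing.get(descriptor);
        return (ids == null || ids.isEmpty()) ? -1L : ids.peekFirst();
    }

    void untrackRequest(String descriptor) {
        Deque<Long> ids = ongoing.get(descriptor);
        if (ids != null && !ids.isEmpty()) {
            long id = ids.pollFirst();
            System.out.printf("Untracking request %s with id [%d]. Current ongoing requests: %d.%n",
                    descriptor, id, ongoingCount());
        }
    }

    private long ongoingCount() {
        return ongoing.values().stream().mapToLong(Deque::size).sum();
    }
}
```

This explains why the Segment Store can log "Tags not found for this request ... Create request tag at this point" and then proceed: if a descriptor lookup misses, the tag is simply created locally from the wire-protocol requestId.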

The Controller untracks the request from RequestTracker and replies to the client:

**controller**:2018-10-10 07:44:31,810 234339 [controllerpool-69] INFO  i.p.common.tracing.RequestTracker - Untracking request createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream with id [3173897036445]. Current ongoing requests: 6.
**controller**:2018-10-10 07:44:31,810 234339 [controllerpool-69] INFO  i.p.c.s.r.g.v.ControllerServiceImpl - [requestId=3173897036445] Untracking request: createStream-testReadWriteAndAutoScaleScope972459555-testReadWriteAndAutoScaleStream.

The client receives the confirmation that the stream has been correctly created:

**client**:2018-10-10 07:44:31,814 3187 [grpc-default-executor-0] INFO  i.p.c.stream.impl.ControllerImpl - [requestId=3173897036445] Stream created successfully: testReadWriteAndAutoScaleStream
**client**:2018-10-10 07:44:31,814 3187 [grpc-default-executor-0] TRACE i.p.c.stream.impl.ControllerImpl - LEAVE createStream@3173897036445 (elapsed=1330839us).
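The ENTER/LEAVE trace pair brackets the whole RPC on the client and reports the elapsed time in microseconds. A sketch of the pattern, with illustrative helper names rather than Pravega's actual trace utilities:

```java
// Sketch of the ENTER/LEAVE trace pattern seen in the client logs: log a tag
// on entry, then log the same tag plus the elapsed time (in microseconds) on
// exit. Helper names are illustrative.
class RpcTracer {
    static long traceEnter(String method, long requestId) {
        System.out.printf("ENTER %s@%d.%n", method, requestId);
        return System.nanoTime();
    }

    static long traceLeave(String method, long requestId, long enterTimeNanos) {
        long elapsedMicros = (System.nanoTime() - enterTimeNanos) / 1_000L;
        System.out.printf("LEAVE %s@%d (elapsed=%dus).%n", method, requestId, elapsedMicros);
        return elapsedMicros;
    }
}
```

Because the same requestId appears in both lines, grepping for it immediately yields the client-observed latency of the operation.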

sealStream

Interpretation of the logs after grepping for request id 4112048762987:

First, the client creates and logs the request id and attaches it to the RPC request:

**client**:2018-10-10 08:00:17,567 4466 [Time-limited test] TRACE i.p.c.stream.impl.ControllerImpl - ENTER sealStream@4112048762987 [testStreamsAndScopesManagement0, 1].
**client**:2018-10-10 08:00:17,568 4467 [StreamManager-Controller-1] INFO  i.p.common.tracing.TracingHelpers - [requestId=4112048762987] Tagging RPC request sealStream-testStreamsAndScopesManagement0-1.
**client**:2018-10-10 08:00:17,570 4469 [grpc-default-worker-ELG-1-2] DEBUG io.grpc.netty.NettyClientHandler - [id: 0xbdd41b34, L:/172.17.0.2:57900 - R:172.16.18.24/172.16.18.24:9092] OUTBOUND HEADERS: streamId=19 headers=GrpcHttp2OutboundHeaders[:authority: 172.16.18.24:9092, :path: /io.pravega.controller.stream.api.grpc.v1.ControllerService/sealStream, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty, requestdescriptor: sealStream-testStreamsAndScopesManagement0-1, requestid: 4112048762987, grpc-accept-encoding: gzip, grpc-trace-bin: ] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
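The OUTBOUND HEADERS line shows the two custom gRPC headers that carry the tag: `requestdescriptor` and `requestid`. Only those header names and the descriptor format (method and arguments joined by `-`) are taken from the logs; the helper below is an illustrative sketch of how a client-side interceptor could build them, not the actual Pravega code:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.ThreadLocalRandom;

// Sketch of building the two tracing headers visible in the OUTBOUND HEADERS
// log line. Class and method names are illustrative.
class RequestTag {
    // e.g., descriptor("sealStream", "scope", "stream") -> "sealStream-scope-stream"
    static String descriptor(String method, String... args) {
        return method + "-" + String.join("-", args);
    }

    static Map<String, String> tracingHeaders(String descriptor, long requestId) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("requestdescriptor", descriptor);
        headers.put("requestid", Long.toString(requestId));
        return headers;
    }

    // A non-negative random id, comparable in form to the ids seen in the logs.
    static long newRequestId() {
        return ThreadLocalRandom.current().nextLong(Long.MAX_VALUE);
    }
}
```

A server-side interceptor can then read the same two headers on the inbound side (as the Controller's INBOUND HEADERS line shows) and hand them to the RequestTracker.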

The Controller receives the request and adds the client request id to the RequestTracker:

**controller**:2018-10-10 08:00:17,573 1242044 [grpc-default-worker-ELG-4-7] DEBUG io.grpc.netty.NettyServerHandler - [id: 0x6e96454f, L:/172.16.18.24:9092 - R:/172.16.18.23:57900] INBOUND HEADERS: streamId=19 headers=GrpcHttp2RequestHeaders[:path: /io.pravega.controller.stream.api.grpc.v1.ControllerService/sealStream, :authority: 172.16.18.24:9092, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, user-agent: grpc-java-netty, requestdescriptor: sealStream-testStreamsAndScopesManagement0-1, requestid: 4112048762987, grpc-accept-encoding: gzip, grpc-trace-bin: ] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
**controller**:2018-10-10 08:00:17,573 1242044 [grpc-default-executor-40] INFO  i.p.common.tracing.RequestTracker - Tracking request sealStream-testStreamsAndScopesManagement0-1 with id 4112048762987. Current ongoing requests: 1.
**controller**:2018-10-10 08:00:17,573 1242044 [grpc-default-executor-40] INFO  i.p.common.tracing.TracingHelpers - [requestId=4112048762987] Received tag from RPC request sealStream-testStreamsAndScopesManagement0-1.
**controller**:2018-10-10 08:00:17,573 1242044 [grpc-default-executor-40] INFO  i.p.c.s.r.g.v.ControllerServiceImpl - [requestId=4112048762987] sealStream called for stream testStreamsAndScopesManagement0/1.

A SealStreamTask takes over the request and processes it, including the operations against the Segment Store:

**controller**:2018-10-10 08:00:18,471 1242942 [controllerpool-5] DEBUG i.p.c.s.e.r.SealStreamTask - [requestId=4112048762987] Sending notification to segment store to seal segments for stream testStreamsAndScopesManagement0/1
**controller**:2018-10-10 08:00:18,483 1242954 [epollEventLoopGroup-2-8] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentSealed(type=SEGMENT_SEALED, requestId=4112048762987, segment=testStreamsAndScopesManagement0/1/0.#epoch.0)
**controller**:2018-10-10 08:00:18,484 1242955 [epollEventLoopGroup-2-8] INFO  i.p.controller.server.SegmentHelper - [requestId=4112048762987] sealSegment testStreamsAndScopesManagement0/1/0.#epoch.0 segmentSealed
**controller**:2018-10-10 08:00:18,484 1242955 [epollEventLoopGroup-2-8] WARN  i.p.controller.server.SegmentHelper - [requestId=4112048762987] sealSegment testStreamsAndScopesManagement0/1/0.#epoch.0 connectionDropped

A Segment Store server receives and processes the request (also in Tier-2):

**segmentstore-1**:2018-10-10 08:00:18,475 1105614 [epollEventLoopGroup-3-2] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.SealSegment(type=SEAL_SEGMENT, requestId=4112048762987, segment=testStreamsAndScopesManagement0/1/0.#epoch.0, delegationToken=)
**segmentstore-1**:2018-10-10 08:00:18,475 1105614 [epollEventLoopGroup-3-2] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=4112048762987, descriptor=sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0. Create request tag at this point.
**segmentstore-1**:2018-10-10 08:00:18,475 1105614 [epollEventLoopGroup-3-2] INFO  i.p.common.tracing.RequestTracker - Tracking request sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0 with id 4112048762987. Current ongoing requests: 7.
**segmentstore-1**:2018-10-10 08:00:18,475 1105614 [epollEventLoopGroup-3-2] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=4112048762987] Sealing segment WireCommands.SealSegment(type=SEAL_SEGMENT, requestId=4112048762987, segment=testStreamsAndScopesManagement0/1/0.#epoch.0, delegationToken=)
**segmentstore-1**:2018-10-10 08:00:18,475 1105614 [epollEventLoopGroup-3-2] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=4112048762987] SegmentContainer[0]: sealStreamSegment [testStreamsAndScopesManagement0/1/0.#epoch.0]
**segmentstore-1**:2018-10-10 08:00:18,483 1105622 [storage-io-18] INFO  i.p.common.tracing.RequestTracker - Tracking request sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$state with id 4112048762987. Current ongoing requests: 8.
**segmentstore-1**:2018-10-10 08:00:18,483 1105622 [storage-io-18] INFO  i.p.common.tracing.RequestTracker - Tracking request sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$attributes with id 4112048762987. Current ongoing requests: 9.
**segmentstore-1**:2018-10-10 08:00:18,493 1105632 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4112048762987] Sealed Header for 'testStreamsAndScopesManagement0/1/0.#epoch.0'.
**segmentstore-1**:2018-10-10 08:00:18,493 1105632 [storage-io-18] INFO  i.p.common.tracing.RequestTracker - Untracking request sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0 with id [4112048762987]. Current ongoing requests: 8.
**segmentstore-1**:2018-10-10 08:00:18,597 1105736 [storage-io-11] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4112048762987] Sealed Header for 'testStreamsAndScopesManagement0/1/0.#epoch.0$attributes'.
**segmentstore-1**:2018-10-10 08:00:18,597 1105736 [storage-io-11] INFO  i.p.common.tracing.RequestTracker - Untracking request sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$attributes with id [4112048762987]. Current ongoing requests: 8.
**segmentstore-1**:2018-10-10 08:00:18,627 1105766 [storage-io-17] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4112048762987] Sealed active SegmentChunk 'testStreamsAndScopesManagement0/1/0.#epoch.0$state$offset.0' for 'testStreamsAndScopesManagement0/1/0.#epoch.0$state'.
**segmentstore-1**:2018-10-10 08:00:18,634 1105773 [storage-io-17] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4112048762987] Sealed Header for 'testStreamsAndScopesManagement0/1/0.#epoch.0$state'.
**segmentstore-1**:2018-10-10 08:00:18,634 1105773 [storage-io-17] INFO  i.p.common.tracing.RequestTracker - Untracking request sealSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$state with id [4112048762987]. Current ongoing requests: 6.

The Controller untracks the request from RequestTracker and replies to the client:

**controller**:2018-10-10 08:00:18,507 1242978 [controllerpool-64] INFO  i.p.common.tracing.RequestTracker - Untracking request sealStream-testStreamsAndScopesManagement0-1 with id [4112048762987]. Current ongoing requests: 0.
**controller**:2018-10-10 08:00:18,507 1242978 [controllerpool-64] INFO  i.p.c.s.r.g.v.ControllerServiceImpl - [requestId=4112048762987] Untracking request: sealStream-testStreamsAndScopesManagement0-1.

The client receives the confirmation that the stream has been correctly sealed:

**client**:2018-10-10 08:00:18,510 5409 [grpc-default-executor-0] INFO  i.p.c.stream.impl.ControllerImpl - [requestId=4112048762987] Successfully sealed stream: 1
**client**:2018-10-10 08:00:18,510 5409 [grpc-default-executor-0] TRACE i.p.c.stream.impl.ControllerImpl - LEAVE sealStream@4112048762987 (elapsed=942821us).

deleteStream

Interpretation of the logs after grepping for request id 4159074315044:

First, the client creates and logs the request id and attaches it to the RPC request:

**client**:2018-10-10 17:24:05,605 2983 [Time-limited test] TRACE i.p.c.stream.impl.ControllerImpl - ENTER deleteStream@4159074315044 [testStreamsAndScopesManagement0, 1].
**client**:2018-10-10 17:24:05,605 2983 [StreamManager-Controller-1] INFO  i.p.c.tracing.RPCTracingHelpers - [requestId=4159074315044] Tagging RPC request deleteStream-testStreamsAndScopesManagement0-1. 
**client**:2018-10-10 17:24:05,606 2984 [grpc-default-worker-ELG-1-2] DEBUG io.grpc.netty.NettyClientHandler - [id: 0xdd22bea5, L:/172.17.0.2:38080 - R:172.16.18.24/172.16.18.24:9092] OUTBOUND HEADERS: streamId=21 headers=GrpcHttp2OutboundHeaders[:authority: 172.16.18.24:9092, :path: /io.pravega.controller.stream.api.grpc.v1.ControllerService/deleteStream, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty, requestdescriptor: deleteStream-testStreamsAndScopesManagement0-1, requestid: 4159074315044, grpc-accept-encoding: gzip, grpc-trace-bin: ] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false

The Controller receives the request and adds the client request id to the RequestTracker:

**controller**:2018-10-10 17:24:05,607 1130760 [grpc-default-worker-ELG-4-5] DEBUG io.grpc.netty.NettyServerHandler - [id: 0xe95a6412, L:/172.16.18.24:9092 - R:/172.16.18.22:38080] INBOUND HEADERS: streamId=21 headers=GrpcHttp2RequestHeaders[:path: /io.pravega.controller.stream.api.grpc.v1.ControllerService/deleteStream, :authority: 172.16.18.24:9092, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, user-agent: grpc-java-netty, requestdescriptor: deleteStream-testStreamsAndScopesManagement0-1, requestid: 4159074315044, grpc-accept-encoding: gzip, grpc-trace-bin: ] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
**controller**:2018-10-10 17:24:05,608 1130761 [grpc-default-executor-18] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteStream-testStreamsAndScopesManagement0-1 with id 4159074315044. Current ongoing requests: 1.
**controller**:2018-10-10 17:24:05,608 1130761 [grpc-default-executor-18] INFO  i.p.c.tracing.RPCTracingHelpers - [requestId=4159074315044] Received tag from RPC request deleteStream-testStreamsAndScopesManagement0-1.
**controller**:2018-10-10 17:24:05,608 1130761 [grpc-default-executor-18] INFO  i.p.c.s.r.g.v.ControllerServiceImpl - [requestId=4159074315044] deleteStream called for stream testStreamsAndScopesManagement0/1.

A DeleteStreamTask takes over the request and processes it, including the operations against the Segment Store:

**controller**:2018-10-10 17:24:05,621 1130774 [controllerpool-63] INFO  i.p.c.s.e.r.DeleteStreamTask - [requestId=4159074315044] testStreamsAndScopesManagement0/1 deleting segments
**controller**:2018-10-10 17:24:06,085 1131238 [epollEventLoopGroup-2-2] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentDeleted(type=SEGMENT_DELETED, requestId=4159074315044, segment=testStreamsAndScopesManagement0/1/0.#epoch.0)
**controller**:2018-10-10 17:24:06,085 1131238 [epollEventLoopGroup-2-2] INFO  i.p.controller.server.SegmentHelper - [requestId=4159074315044] deleteSegment testStreamsAndScopesManagement0/1/0.#epoch.0 SegmentDeleted
**controller**:2018-10-10 17:24:06,085 1131238 [epollEventLoopGroup-2-2] WARN  i.p.controller.server.SegmentHelper - [requestId=4159074315044] deleteSegment testStreamsAndScopesManagement0/1/0.#epoch.0 Connection dropped

A Segment Store server receives and processes the request for both segments (also in Tier-2):

**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.DeleteSegment(type=DELETE_SEGMENT, requestId=4159074315044, segment=testStreamsAndScopesManagement0/1/0.#epoch.0, delegationToken=)
**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=4159074315044, descriptor=deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0. Create request tag at this point.
**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0 with id 4159074315044. Current ongoing requests: 5.
**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$state with id 4159074315044. Current ongoing requests: 6.
**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$attributes with id 4159074315044. Current ongoing requests: 7.
**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=4159074315044] Deleting segment WireCommands.DeleteSegment(type=DELETE_SEGMENT, requestId=4159074315044, segment=testStreamsAndScopesManagement0/1/0.#epoch.0, delegationToken=)
**segmentstore**:2018-10-10 17:24:05,630 992044 [epollEventLoopGroup-3-3] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=4159074315044] SegmentContainer[0]: deleteStreamSegment [testStreamsAndScopesManagement0/1/0.#epoch.0]
**segmentstore**:2018-10-10 17:24:05,646 992060 [storage-io-16] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4159074315044] Deleted segment 'testStreamsAndScopesManagement0/1/0.#epoch.0'.
**segmentstore**:2018-10-10 17:24:05,646 992060 [storage-io-16] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0 with id [4159074315044]. Current ongoing requests: 6.
**segmentstore**:2018-10-10 17:24:06,037 992451 [storage-io-13] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4159074315044] Deleted segment 'testStreamsAndScopesManagement0/1/0.#epoch.0$attributes'.
**segmentstore**:2018-10-10 17:24:06,037 992451 [storage-io-13] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$attributes with id [4159074315044]. Current ongoing requests: 3.
**segmentstore**:2018-10-10 17:24:06,079 992493 [storage-io-17] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4159074315044] Deleted SegmentChunk 'testStreamsAndScopesManagement0/1/0.#epoch.0$state$offset.0 (0+143, deleted)' for 'testStreamsAndScopesManagement0/1/0.#epoch.0$state (Sealed, RW, SegmentChunks=1)'.
**segmentstore**:2018-10-10 17:24:06,084 992498 [storage-io-17] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=4159074315044] Deleted segment 'testStreamsAndScopesManagement0/1/0.#epoch.0$state'.
**segmentstore**:2018-10-10 17:24:06,084 992498 [storage-io-17] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testStreamsAndScopesManagement0/1/0.#epoch.0$state with id [4159074315044]. Current ongoing requests: 1.

The Controller untracks the request from RequestTracker and replies to the client:

**controller**:2018-10-10 17:24:06,231 1131384 [controllerpool-37] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteStream-testStreamsAndScopesManagement0-1 with id [4159074315044]. Current ongoing requests: 0.
**controller**:2018-10-10 17:24:06,231 1131384 [controllerpool-37] INFO  i.p.c.s.r.g.v.ControllerServiceImpl - [requestId=4159074315044] Untracking request: deleteStream-testStreamsAndScopesManagement0-1.

The client receives the confirmation that the stream has been correctly deleted:

**client**:2018-10-10 17:24:06,234 3612 [grpc-default-executor-0] INFO  i.p.c.stream.impl.ControllerImpl - [requestId=4159074315044] Successfully deleted stream: 1
**client**:2018-10-10 17:24:06,235 3613 [grpc-default-executor-0] TRACE i.p.c.stream.impl.ControllerImpl - LEAVE deleteStream@4159074315044 (elapsed=629796us).

commitTransaction

Interpretation of the logs after grepping for request id 10052660475608 (transaction 00000000011b):

First, the Controller (CommitRequestHandler) receives a new event to commit a set of transactions:

**controller**:2018-10-10 19:02:21,050 7026203 [controllerpool-12] DEBUG i.p.c.s.e.r.CommitRequestHandler - [requestId=10052660475608] Committing transaction 00000000-0000-0000-0000-00000000011b on stream testAutoScale382970353/testTxnScaleUpWithRead
**controller**:2018-10-10 19:02:21,172 7026325 [epollEventLoopGroup-2-7] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentsMerged(type=SEGMENTS_MERGED, requestId=10052660475608, target=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0, source=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b)
**controller**:2018-10-10 19:02:21,172 7026325 [epollEventLoopGroup-2-7] DEBUG i.p.controller.server.SegmentHelper - [requestId=10052660475608] commitTransaction testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b TransactionCommitted
**controller**:2018-10-10 19:02:21,172 7026325 [epollEventLoopGroup-2-7] WARN  i.p.controller.server.SegmentHelper - [requestId=10052660475608] commitTransaction testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b connection dropped

The request for merging the transaction segment into the main stream is handled by the Segment Store server:

**segmentstore**:2018-10-10 19:02:21,052 6887466 [epollEventLoopGroup-3-2] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.MergeSegments(type=MERGE_SEGMENTS, requestId=10052660475608, target=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0, source=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b, delegationToken=)
**segmentstore**:2018-10-10 19:02:21,052 6887466 [epollEventLoopGroup-3-2] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=10052660475608, descriptor=mergeSegments-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0. Create request tag at this point.
**segmentstore**:2018-10-10 19:02:21,052 6887466 [epollEventLoopGroup-3-2] DEBUG i.p.common.tracing.RequestTracker - Tracking request mergeSegments-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0 with id 10052660475608. Current ongoing requests: 792.

Apart from tracking the mergeSegments operation itself, merging two segments typically entails sealing and deleting the source (transaction) segment. For this reason, the requestId is propagated internally to track these future operations:

**segmentstore**:2018-10-10 19:02:21,053 6887467 [epollEventLoopGroup-3-2] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b with id 10052660475608. Current ongoing requests: 793.
**segmentstore**:2018-10-10 19:02:21,054 6887468 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b with id 10052660475608. Current ongoing requests: 800.
**segmentstore**:2018-10-10 19:02:21,055 6887469 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state with id 10052660475608. Current ongoing requests: 801.
**segmentstore**:2018-10-10 19:02:21,055 6887469 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$attributes with id 10052660475608. Current ongoing requests: 802.
**segmentstore**:2018-10-10 19:02:21,055 6887469 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b with id 10052660475608. Current ongoing requests: 803.
**segmentstore**:2018-10-10 19:02:21,055 6887469 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state with id 10052660475608. Current ongoing requests: 804.
**segmentstore**:2018-10-10 19:02:21,056 6887470 [epollEventLoopGroup-3-3] DEBUG i.p.common.tracing.RequestTracker - Tracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$attributes with id 10052660475608. Current ongoing requests: 805.
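The fan-out above can be summarized as: for each source (transaction) segment being merged, a `sealSegment-` and a `deleteSegment-` descriptor is pre-registered under the merge's requestId, for the segment itself and for its `$state` and `$attributes` companions. A sketch of that derivation, with illustrative names rather than the actual RequestTracker API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the descriptor fan-out seen above: pre-register the seal/delete
// operations that a merge will later trigger on the source segment, all under
// the same requestId. Names are illustrative.
class MergeTagPropagation {
    static Map<String, Long> derivedDescriptors(String sourceSegment, long requestId) {
        Map<String, Long> tags = new LinkedHashMap<>();
        for (String op : new String[] {"sealSegment", "deleteSegment"}) {
            // The segment itself plus its $state and $attributes companions.
            for (String suffix : new String[] {"", "$state", "$attributes"}) {
                tags.put(op + "-" + sourceSegment + suffix, requestId);
            }
        }
        return tags;
    }
}
```

When the seal or delete later runs on a storage thread, it looks its descriptor up in the tracker and recovers the original merge's requestId, which is why the subsequent RollingStorage lines still carry `requestId=10052660475608`.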

The Segment Store server proceeds with the mergeSegments operations, as well as with sealing and deleting the transaction segments:

**segmentstore**:2018-10-10 19:02:21,054 6887468 [epollEventLoopGroup-3-2] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=10052660475608] Merging Segments WireCommands.MergeSegments(type=MERGE_SEGMENTS, requestId=10052660475608, target=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0, source=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b, delegationToken=)
**segmentstore**:2018-10-10 19:02:21,054 6887468 [epollEventLoopGroup-3-2] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=10052660475608] SegmentContainer[1]: mergeStreamSegment [testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b, testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0]
**segmentstore**:2018-10-10 19:02:21,056 6887470 [epollEventLoopGroup-3-3] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=10052660475608] Merging Segments WireCommands.MergeSegments(type=MERGE_SEGMENTS, requestId=10052660475608, target=testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0, source=testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b, delegationToken=)
**segmentstore**:2018-10-10 19:02:21,056 6887470 [epollEventLoopGroup-3-3] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=10052660475608] SegmentContainer[3]: mergeStreamSegment [testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b, testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0]
**segmentstore**:2018-10-10 19:02:21,124 6887538 [storage-io-20] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed active SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state$offset.0' for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state'.
**segmentstore**:2018-10-10 19:02:21,133 6887547 [core-14] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=10052660475608] SegmentContainer[3]: Deleting empty source segment instead of merging testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b.
**segmentstore**:2018-10-10 19:02:21,133 6887547 [core-14] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=10052660475608] SegmentContainer[3]: deleteStreamSegment [testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b]
**segmentstore**:2018-10-10 19:02:21,149 6887563 [storage-io-20] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state'.
**segmentstore**:2018-10-10 19:02:21,149 6887563 [storage-io-20] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state with id [10052660475608]. Current ongoing requests: 802.
**segmentstore**:2018-10-10 19:02:21,165 6887579 [storage-io-20] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Deleted SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state$offset.0 (0+200, deleted)' for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state (Sealed, RW, SegmentChunks=1)'.
**segmentstore**:2018-10-10 19:02:21,171 6887585 [storage-io-20] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Deleted segment 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state'.
**segmentstore**:2018-10-10 19:02:21,171 6887585 [storage-io-20] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$state with id [10052660475608]. Current ongoing requests: 801.
**segmentstore**:2018-10-10 19:02:21,175 6887589 [storage-io-15] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b'.
**segmentstore**:2018-10-10 19:02:21,175 6887589 [storage-io-15] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b with id [10052660475608]. Current ongoing requests: 800.
**segmentstore**:2018-10-10 19:02:21,182 6887596 [storage-io-15] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Deleted segment 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b'.
**segmentstore**:2018-10-10 19:02:21,183 6887597 [storage-io-15] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b with id [10052660475608]. Current ongoing requests: 805.
**segmentstore**:2018-10-10 19:02:21,250 6887664 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed active SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state$offset.0' for 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state'.
**segmentstore**:2018-10-10 19:02:21,258 6887672 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state'.
**segmentstore**:2018-10-10 19:02:21,258 6887672 [storage-io-18] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state with id [10052660475608]. Current ongoing requests: 803.
**segmentstore**:2018-10-10 19:02:21,269 6887683 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Deleted SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state$offset.0 (0+200, deleted)' for 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state (Sealed, RW, SegmentChunks=1)'.
**segmentstore**:2018-10-10 19:02:21,275 6887689 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Deleted segment 'testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state'.
**segmentstore**:2018-10-10 19:02:21,275 6887689 [storage-io-18] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b$state with id [10052660475608]. Current ongoing requests: 802.
**segmentstore**:2018-10-10 19:02:21,553 6887967 [storage-io-2] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed active SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$offset.0' for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b'.
**segmentstore**:2018-10-10 19:02:21,562 6887976 [storage-io-2] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b'.
**segmentstore**:2018-10-10 19:02:21,562 6887976 [storage-io-2] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b with id [10052660475608]. Current ongoing requests: 802.
**segmentstore**:2018-10-10 19:02:21,614 6888028 [storage-io-15] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Concat 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b (Sealed, RW, SegmentChunks=1)' into 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0 (Not Sealed, RW, SegmentChunks=1)' using native method.
**segmentstore**:2018-10-10 19:02:21,671 6888085 [storage-io-6] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$attributes'.
**segmentstore**:2018-10-10 19:02:21,671 6888085 [storage-io-6] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$attributes with id [10052660475608]. Current ongoing requests: 798.
**segmentstore**:2018-10-10 19:02:21,677 6888091 [storage-io-6] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=10052660475608] Deleted segment 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$attributes'.
**segmentstore**:2018-10-10 19:02:21,677 6888091 [storage-io-6] DEBUG i.p.common.tracing.RequestTracker - Untracking request deleteSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.0000000000000000000000000000011b$attributes with id [10052660475608]. Current ongoing requests: 797.

When the mergeSegments request finishes successfully, the Controller is notified that the transaction has been committed:

**controller**:2018-10-10 19:02:21,301 7026454 [epollEventLoopGroup-2-8] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentsMerged(type=SEGMENTS_MERGED, requestId=10052660475608, target=testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0, source=testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b)
**controller**:2018-10-10 19:02:21,301 7026454 [epollEventLoopGroup-2-8] DEBUG i.p.controller.server.SegmentHelper - [requestId=10052660475608] commitTransaction testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b TransactionCommitted
**controller**:2018-10-10 19:02:21,301 7026454 [epollEventLoopGroup-2-8] WARN  i.p.controller.server.SegmentHelper - [requestId=10052660475608] commitTransaction testAutoScale382970353/testTxnScaleUpWithRead/1.#epoch.0#transaction.0000000000000000000000000000011b connection dropped
**controller**:2018-10-10 19:02:21,318 7026471 [controllerpool-5] DEBUG i.p.c.s.e.r.CommitRequestHandler - [requestId=10052660475608] transaction 00000000-0000-0000-0000-00000000011b on stream testAutoScale382970353/testTxnScaleUpWithRead committed successfully

retention

Interpretation of logs after executing grep on request id 8096246361021:
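The filtering step itself is a plain grep over the collected logs. A minimal sketch, using an illustrative file name and made-up log lines rather than output from a real deployment:

```shell
# Build a tiny sample log (lines are illustrative, not real Pravega output).
printf '%s\n' \
  'controller: [requestId=8096246361021] Started truncation request' \
  'controller: [requestId=4242424242424] Unrelated activity' \
  'segmentstore: [requestId=8096246361021] Truncated segment' \
  > /tmp/pravega-sample.log

# Keep only the lines belonging to the request under investigation.
grep 'requestId=8096246361021' /tmp/pravega-sample.log
```

Because every component the request touches logs the same id, a single grep reconstructs the request's end-to-end history.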

The Controller retention service (StreamCutBucketService) determines that it is time to truncate stream testRetentionScope195665459/testRetentionStream:

**controller**:2018-10-10 09:06:37,792 5222263 [retentionpool-1] DEBUG i.p.c.s.r.StreamCutBucketService - [requestId=8096246361021] Periodic background processing for retention called for stream testRetentionScope195665459/testRetentionStream
**controller**:2018-10-10 09:06:38,837 5223308 [controllerpool-58] DEBUG i.p.c.t.Stream.StreamMetadataTasks - [requestId=8096246361021] Started truncation request for stream testRetentionScope195665459/testRetentionStream
**controller**:2018-10-10 09:06:39,401 5223872 [controllerpool-28] INFO  i.p.c.s.e.r.TruncateStreamTask - [requestId=8096246361021] Truncating stream testRetentionScope195665459/testRetentionStream at stream cut: {0=0, 1=43}
**controller**:2018-10-10 09:06:39,408 5223879 [controllerpool-1] DEBUG i.p.c.s.e.r.TruncateStreamTask - [requestId=8096246361021] testRetentionScope195665459/testRetentionStream truncating segments
**controller**:2018-10-10 09:06:39,416 5223887 [epollEventLoopGroup-2-1] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentTruncated(type=SEGMENT_TRUNCATED, requestId=8096246361021, segment=testRetentionScope195665459/testRetentionStream/0.#epoch.0)
**controller**:2018-10-10 09:06:39,416 5223887 [epollEventLoopGroup-2-1] INFO  i.p.controller.server.SegmentHelper - [requestId=8096246361021] truncateSegment testRetentionScope195665459/testRetentionStream/0.#epoch.0 SegmentTruncated
**controller**:2018-10-10 09:06:39,417 5223888 [epollEventLoopGroup-2-1] WARN  i.p.controller.server.SegmentHelper - [requestId=8096246361021] truncateSegment testRetentionScope195665459/testRetentionStream/0.#epoch.0 Connection dropped
**controller**:2018-10-10 09:06:39,418 5223889 [epollEventLoopGroup-2-2] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentTruncated(type=SEGMENT_TRUNCATED, requestId=8096246361021, segment=testRetentionScope195665459/testRetentionStream/1.#epoch.0)
**controller**:2018-10-10 09:06:39,418 5223889 [epollEventLoopGroup-2-2] INFO  i.p.controller.server.SegmentHelper - [requestId=8096246361021] truncateSegment testRetentionScope195665459/testRetentionStream/1.#epoch.0 SegmentTruncated
**controller**:2018-10-10 09:06:39,418 5223889 [epollEventLoopGroup-2-2] DEBUG i.p.c.s.e.r.TruncateStreamTask - [requestId=8096246361021] testRetentionScope195665459/testRetentionStream deleting segments []
**controller**:2018-10-10 09:06:39,419 5223890 [epollEventLoopGroup-2-2] WARN  i.p.controller.server.SegmentHelper - [requestId=8096246361021] truncateSegment testRetentionScope195665459/testRetentionStream/1.#epoch.0 Connection dropped

The Controller issues the requests to truncate the segments of the stream in the Segment Store:

**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-7] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.TruncateSegment(type=TRUNCATE_SEGMENT, requestId=8096246361021, segment=testRetentionScope195665459/testRetentionStream/1.#epoch.0, truncationOffset=43, delegationToken=)
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-6] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.TruncateSegment(type=TRUNCATE_SEGMENT, requestId=8096246361021, segment=testRetentionScope195665459/testRetentionStream/0.#epoch.0, truncationOffset=0, delegationToken=)
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-7] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=8096246361021, descriptor=truncateSegment-testRetentionScope195665459/testRetentionStream/1.#epoch.0. Create request tag at this point.
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-6] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=8096246361021, descriptor=truncateSegment-testRetentionScope195665459/testRetentionStream/0.#epoch.0. Create request tag at this point.
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-7] INFO  i.p.common.tracing.RequestTracker - Tracking request truncateSegment-testRetentionScope195665459/testRetentionStream/1.#epoch.0 with id 8096246361021. Current ongoing requests: 4.
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-7] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=8096246361021] Truncating segment testRetentionScope195665459/testRetentionStream/1.#epoch.0 at offset 43.
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-6] INFO  i.p.common.tracing.RequestTracker - Tracking request truncateSegment-testRetentionScope195665459/testRetentionStream/0.#epoch.0 with id 8096246361021. Current ongoing requests: 5.
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-6] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=8096246361021] Truncating segment testRetentionScope195665459/testRetentionStream/0.#epoch.0 at offset 0.
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-6] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=8096246361021] SegmentContainer[2]: truncateStreamSegment [testRetentionScope195665459/testRetentionStream/0.#epoch.0]
**segmentstore**:2018-10-10 09:06:39,412 5086551 [epollEventLoopGroup-3-7] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=8096246361021] SegmentContainer[3]: truncateStreamSegment [testRetentionScope195665459/testRetentionStream/1.#epoch.0]
**segmentstore**:2018-10-10 09:06:39,417 5086556 [storage-io-16] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=8096246361021] Truncated segment 'testRetentionScope195665459/testRetentionStream/0.#epoch.0'.
**segmentstore**:2018-10-10 09:06:39,417 5086556 [storage-io-16] INFO  i.p.common.tracing.RequestTracker - Untracking request truncateSegment-testRetentionScope195665459/testRetentionStream/0.#epoch.0 with id [8096246361021]. Current ongoing requests: 4.
**segmentstore**:2018-10-10 09:06:39,418 5086557 [storage-io-4] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=8096246361021] Truncated segment 'testRetentionScope195665459/testRetentionStream/1.#epoch.0'.
**segmentstore**:2018-10-10 09:06:39,418 5086557 [storage-io-4] INFO  i.p.common.tracing.RequestTracker - Untracking request truncateSegment-testRetentionScope195665459/testRetentionStream/1.#epoch.0 with id [8096246361021]. Current ongoing requests: 3.
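The RequestTracker entries above can be read as operations on a map from request descriptors (operation plus segment name) to request ids. The following is a minimal sketch of that bookkeeping; the class name and method signatures are inferred from the log messages, and the actual io.pravega.common.tracing.RequestTracker may differ:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of per-request bookkeeping inferred from the log output above.
public class RequestTrackerSketch {
    private final Map<String, Long> ongoingRequests = new ConcurrentHashMap<>();

    // Associate a descriptor such as "truncateSegment-<segment>" with a request id.
    public void trackRequest(String descriptor, long requestId) {
        ongoingRequests.put(descriptor, requestId);
    }

    // Look up the id for a descriptor. When the tag was never tracked (e.g. the
    // operation arrived carrying only a wire-protocol requestId), this returns
    // null and the caller creates the request tag at this point.
    public Long getRequestId(String descriptor) {
        return ongoingRequests.get(descriptor);
    }

    // Remove the descriptor once the operation completes.
    public void untrackRequest(String descriptor) {
        ongoingRequests.remove(descriptor);
    }

    // Number of requests currently being tracked ("Current ongoing requests: N").
    public int ongoingCount() {
        return ongoingRequests.size();
    }
}
```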

autoScale

Interpretation of logs after executing grep on request id 1539198139871:

A Segment Store server reaches the maximum load defined for a stream segment, so it triggers a scale operation to split that segment and distribute the load across nodes:

**segmentstore**:2018-10-10 19:02:19,871 6886285 [storage-io-16] INFO  i.p.s.s.h.stat.AutoScaleProcessor - [requestId=1539198139871] sending request for scale up for testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0
**segmentstore**:2018-10-10 19:02:19,905 6886319 [epollEventLoopGroup-5-2] DEBUG i.p.s.s.h.stat.AutoScaleProcessor - [requestId=1539198139871] scale event posted successfully

The request is posted to an internal Pravega stream, where it is received and processed by the Controller:

**controller**:2018-10-10 19:02:21,417 7026570 [controllerpool-25] INFO  i.p.c.s.e.r.AutoScaleTask - [requestId=1539198139871] Scale up request received for stream segment testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0
**controller**:2018-10-10 19:02:21,461 7026614 [controllerpool-5] INFO  i.p.c.s.e.r.ScaleOperationTask - [requestId=1539198139871] starting scale request for testAutoScale382970353/testTxnScaleUpWithRead segments [0] to new ranges [0.0=0.25, 0.25=0.5]

The orchestration on the Controller side requires creating two (or more) new segments:

**controller**:2018-10-10 19:02:22,512 7027665 [epollEventLoopGroup-2-5] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentCreated(type=SEGMENT_CREATED, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1)
**controller**:2018-10-10 19:02:22,512 7027665 [epollEventLoopGroup-2-5] INFO  i.p.controller.server.SegmentHelper - [requestId=1539198139871] CreateSegment testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1 SegmentCreated
**controller**:2018-10-10 19:02:22,512 7027665 [epollEventLoopGroup-2-5] WARN  i.p.controller.server.SegmentHelper - [requestId=1539198139871] CreateSegment testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1 Connection dropped
**controller**:2018-10-10 19:02:22,597 7027750 [epollEventLoopGroup-2-6] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentCreated(type=SEGMENT_CREATED, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1)
**controller**:2018-10-10 19:02:22,597 7027750 [epollEventLoopGroup-2-6] INFO  i.p.controller.server.SegmentHelper - [requestId=1539198139871] CreateSegment testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1 SegmentCreated
**controller**:2018-10-10 19:02:22,597 7027750 [epollEventLoopGroup-2-6] WARN  i.p.controller.server.SegmentHelper - [requestId=1539198139871] CreateSegment testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1 Connection dropped

These createSegment operations are reflected in the Segment Store:

**segmentstore**:2018-10-10 19:02:19,905 6886319 [epollEventLoopGroup-5-2] DEBUG i.p.s.s.h.stat.AutoScaleProcessor - [requestId=1539198139871] scale event posted successfully
**segmentstore**:2018-10-10 19:02:21,492 6887906 [epollEventLoopGroup-3-8] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1, scaleType=2, targetRate=1, delegationToken=)
**segmentstore**:2018-10-10 19:02:21,492 6887906 [epollEventLoopGroup-3-8] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=1539198139871, descriptor=createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1. Create request tag at this point.
**segmentstore**:2018-10-10 19:02:21,492 6887906 [epollEventLoopGroup-3-1] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1, scaleType=2, targetRate=1, delegationToken=)
**segmentstore**:2018-10-10 19:02:21,492 6887906 [epollEventLoopGroup-3-1] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=1539198139871, descriptor=createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1. Create request tag at this point.
**segmentstore**:2018-10-10 19:02:21,492 6887906 [epollEventLoopGroup-3-8] DEBUG i.p.common.tracing.RequestTracker - Tracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1 with id 1539198139871. Current ongoing requests: 801.
**segmentstore**:2018-10-10 19:02:21,493 6887907 [epollEventLoopGroup-3-8] DEBUG i.p.common.tracing.RequestTracker - Tracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$state with id 1539198139871. Current ongoing requests: 802.
**segmentstore**:2018-10-10 19:02:21,493 6887907 [epollEventLoopGroup-3-8] DEBUG i.p.common.tracing.RequestTracker - Tracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$attributes with id 1539198139871. Current ongoing requests: 803.
**segmentstore**:2018-10-10 19:02:21,493 6887907 [epollEventLoopGroup-3-8] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=1539198139871] Creating stream segment WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1, scaleType=2, targetRate=1, delegationToken=).
**segmentstore**:2018-10-10 19:02:21,493 6887907 [epollEventLoopGroup-3-8] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=1539198139871] SegmentContainer[1]: createStreamSegment [testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1]
**segmentstore**:2018-10-10 19:02:21,493 6887907 [epollEventLoopGroup-3-1] DEBUG i.p.common.tracing.RequestTracker - Tracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1 with id 1539198139871. Current ongoing requests: 804.
**segmentstore**:2018-10-10 19:02:21,494 6887908 [epollEventLoopGroup-3-1] DEBUG i.p.common.tracing.RequestTracker - Tracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$state with id 1539198139871. Current ongoing requests: 805.
**segmentstore**:2018-10-10 19:02:21,494 6887908 [epollEventLoopGroup-3-1] DEBUG i.p.common.tracing.RequestTracker - Tracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$attributes with id 1539198139871. Current ongoing requests: 806.
**segmentstore**:2018-10-10 19:02:21,494 6887908 [epollEventLoopGroup-3-1] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=1539198139871] Creating stream segment WireCommands.CreateSegment(type=CREATE_SEGMENT, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1, scaleType=2, targetRate=1, delegationToken=).
**segmentstore**:2018-10-10 19:02:21,494 6887908 [epollEventLoopGroup-3-1] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=1539198139871] SegmentContainer[0]: createStreamSegment [testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1]
**segmentstore**:2018-10-10 19:02:21,541 6887955 [storage-io-7] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Header for 'testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1' fully serialized to 'testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$header'.
**segmentstore**:2018-10-10 19:02:21,541 6887955 [storage-io-7] DEBUG i.p.common.tracing.RequestTracker - Untracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1 with id [1539198139871]. Current ongoing requests: 804.
**segmentstore**:2018-10-10 19:02:21,948 6888362 [storage-io-18] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Header for 'testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1' fully serialized to 'testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$header'.
**segmentstore**:2018-10-10 19:02:21,949 6888363 [storage-io-18] DEBUG i.p.common.tracing.RequestTracker - Untracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1 with id [1539198139871]. Current ongoing requests: 793.
**segmentstore**:2018-10-10 19:02:22,014 6888428 [storage-io-9] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Header for 'testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$state' fully serialized to 'testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$state$header'.
**segmentstore**:2018-10-10 19:02:22,014 6888428 [storage-io-9] DEBUG i.p.common.tracing.RequestTracker - Untracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$state with id [1539198139871]. Current ongoing requests: 791.
**segmentstore**:2018-10-10 19:02:22,057 6888471 [storage-io-7] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Header for 'testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$state' fully serialized to 'testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$state$header'.
**segmentstore**:2018-10-10 19:02:22,057 6888471 [storage-io-7] DEBUG i.p.common.tracing.RequestTracker - Untracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$state with id [1539198139871]. Current ongoing requests: 790.

Moreover, the Controller must seal the original segment:

**controller**:2018-10-10 19:02:22,619 7027772 [epollEventLoopGroup-2-3] DEBUG i.p.c.n.i.ClientConnectionInboundHandler - 172.16.18.22 processing reply: WireCommands.SegmentSealed(type=SEGMENT_SEALED, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0)
**controller**:2018-10-10 19:02:22,619 7027772 [epollEventLoopGroup-2-3] INFO  i.p.controller.server.SegmentHelper - [requestId=1539198139871] sealSegment testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0 segmentSealed
**controller**:2018-10-10 19:02:22,620 7027773 [epollEventLoopGroup-2-3] WARN  i.p.controller.server.SegmentHelper - [requestId=1539198139871] sealSegment testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0 connectionDropped
**controller**:2018-10-10 19:02:22,646 7027799 [controllerpool-51] INFO  i.p.c.s.e.r.ScaleOperationTask - [requestId=1539198139871] scale processing for testAutoScale382970353/testTxnScaleUpWithRead epoch 0 completed.
**controller**:2018-10-10 19:02:22,646 7027799 [controllerpool-51] INFO  i.p.c.s.e.r.ScaleOperationTask - [requestId=1539198139871] scale request for testAutoScale382970353/testTxnScaleUpWithRead segments [0] to new ranges [0.0=0.25, 0.25=0.5] completed successfully.

This sealSegment operation is also reflected in the Segment Store:

**segmentstore**:2018-10-10 19:02:22,609 6889023 [epollEventLoopGroup-3-2] DEBUG i.p.s.s.h.h.ServerConnectionInboundHandler - Processing request: WireCommands.SealSegment(type=SEAL_SEGMENT, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0, delegationToken=)
**segmentstore**:2018-10-10 19:02:22,609 6889023 [epollEventLoopGroup-3-2] INFO  i.p.common.tracing.RequestTracker - Tags not found for this request: requestId=1539198139871, descriptor=sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0. Create request tag at this point.
**segmentstore**:2018-10-10 19:02:22,609 6889023 [epollEventLoopGroup-3-2] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0 with id 1539198139871. Current ongoing requests: 799.
**segmentstore**:2018-10-10 19:02:22,610 6889024 [epollEventLoopGroup-3-2] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0$state with id 1539198139871. Current ongoing requests: 800.
**segmentstore**:2018-10-10 19:02:22,610 6889024 [epollEventLoopGroup-3-2] DEBUG i.p.common.tracing.RequestTracker - Tracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0$attributes with id 1539198139871. Current ongoing requests: 801.
**segmentstore**:2018-10-10 19:02:22,610 6889024 [epollEventLoopGroup-3-2] INFO  i.p.s.s.h.h.PravegaRequestProcessor - [requestId=1539198139871] Sealing segment WireCommands.SealSegment(type=SEAL_SEGMENT, requestId=1539198139871, segment=testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0, delegationToken=)
**segmentstore**:2018-10-10 19:02:22,610 6889024 [epollEventLoopGroup-3-2] DEBUG i.p.s.s.c.StreamSegmentContainer - [requestId=1539198139871] SegmentContainer[1]: sealStreamSegment [testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0]
**segmentstore**:2018-10-10 19:02:22,630 6889044 [storage-io-13] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Sealed active SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0#transaction.00000000000000000000000000000080$offset.0' for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0'.
**segmentstore**:2018-10-10 19:02:22,648 6889062 [storage-io-13] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0'.
**segmentstore**:2018-10-10 19:02:22,648 6889062 [storage-io-13] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0 with id [1539198139871]. Current ongoing requests: 800.
**segmentstore**:2018-10-10 19:02:23,117 6889531 [storage-io-9] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Sealed active SegmentChunk 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0$attributes$offset.0' for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0$attributes'.
**segmentstore**:2018-10-10 19:02:23,137 6889551 [storage-io-9] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Sealed Header for 'testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0$attributes'.
**segmentstore**:2018-10-10 19:02:23,137 6889551 [storage-io-9] DEBUG i.p.common.tracing.RequestTracker - Untracking request sealSegment-testAutoScale382970353/testTxnScaleUpWithRead/0.#epoch.0$attributes with id [1539198139871]. Current ongoing requests: 793.
**segmentstore**:2018-10-10 19:02:25,738 6892152 [storage-io-4] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Header for 'testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$attributes' fully serialized to 'testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$attributes$header'.
**segmentstore**:2018-10-10 19:02:25,738 6892152 [storage-io-4] DEBUG i.p.common.tracing.RequestTracker - Untracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/2.#epoch.1$attributes with id [1539198139871]. Current ongoing requests: 810.
**segmentstore**:2018-10-10 19:02:25,754 6892168 [storage-io-17] DEBUG i.p.s.s.rolling.RollingStorage - [requestId=1539198139871] Header for 'testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$attributes' fully serialized to 'testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$attributes$header'.
**segmentstore**:2018-10-10 19:02:25,755 6892169 [storage-io-17] DEBUG i.p.common.tracing.RequestTracker - Untracking request createSegment-testAutoScale382970353/testTxnScaleUpWithRead/3.#epoch.1$attributes with id [1539198139871]. Current ongoing requests: 809.

Compatibility

RPC Controller API: Tags will be carried as headers, so no signature changes are expected. The logic that processes request tags should handle the case in which a client sends no tags.

WireProtocol: Request id fields already exist on most of the important requests, so no incompatible changes are expected in the protocol.


Appendix: Discarded Approaches

Integrate a distributed tracing tool (Zipkin - Brave).

There are distributed tracing frameworks that already provide the above functionality to generate tags and attach them to RPC calls. Specifically, one of the most mature tracing frameworks for gRPC seems to be Zipkin/Brave (https://zipkin.io/). Apart from tagging requests, this framework can also provide latency metrics of Pravega RPC calls and store them in Logstash, which could be integrated as an additional source of information in our metrics framework. However, this would introduce an extra third-party dependency in Pravega.

Use of Mapped Diagnostic Context

MDC has traditionally been used as a way of tagging requests, as it stores a custom map in a thread’s local storage that can be used for logging. This works correctly as long as there is a one-to-one relationship between a thread and a request. However, modern applications such as Pravega make extensive use of Executors, which breaks the request-thread relationship. Since a request may consist of a chain of async computations executed by multiple threads, we can run into consistency problems when working with MDC information. An illustrative explanation of this problem can be found here.
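The failure mode is easy to reproduce with plain thread-local state, which is the mechanism MDC builds on. In this illustrative sketch, a tag set on the caller thread is simply invisible to the executor thread that runs the async continuation:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Demonstrates why thread-local context (the mechanism behind MDC) breaks
// once work hops to an executor thread. Names are illustrative.
public class ThreadLocalBreakageDemo {
    private static final ThreadLocal<String> REQUEST_TAG = new ThreadLocal<>();

    public static void main(String[] args) throws Exception {
        REQUEST_TAG.set("createStream-myScope/myStream");
        // On the caller thread the tag is visible.
        System.out.println("caller thread tag: " + REQUEST_TAG.get());

        ExecutorService executor = Executors.newSingleThreadExecutor();
        // The continuation runs on a pool thread with its own (empty)
        // thread-local map, so the tag is gone.
        String tagInPool = executor.submit(REQUEST_TAG::get).get();
        System.out.println("pool thread tag:   " + tagInPool);  // null
        executor.shutdown();
    }
}
```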

Pass “request tags” explicitly to async computations

This solution consists of explicitly passing the request tag to the initial stage of a chain of futures so that subsequent async tasks can use it. While simplistic, this approach may be sufficient in some cases, provided the required method signature changes do not break public APIs.
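A minimal sketch of this idea (method and class names are hypothetical): the tag travels through the chain as an ordinary captured value, so it stays correct no matter which executor thread runs each stage.

```java
import java.util.concurrent.CompletableFuture;

// Sketch of passing a request tag explicitly through a chain of async stages,
// instead of relying on thread-local state.
public class ExplicitTagDemo {
    static CompletableFuture<String> createSegments(long requestId) {
        // The initial stage receives the tag as a plain argument.
        return CompletableFuture.supplyAsync(() ->
                String.format("[requestId=%d] segments created", requestId));
    }

    public static void main(String[] args) {
        long requestId = 10052660475608L;
        String result = createSegments(requestId)
                // Later stages use the tag through the captured variable,
                // regardless of which thread executes them.
                .thenApply(msg -> msg + "; [requestId=" + requestId + "] metadata updated")
                .join();
        System.out.println(result);
    }
}
```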
