Fix task bootstrapping & simplify segment load/drop flows #16475

Merged (48 commits, Jun 4, 2024)

Commits (48)
388e100
Fix task bootstrap locations.
abhishekrb19 May 20, 2024
0cd0f5a
Remove dependency of SegmentCacheManager from SegmentLoadDropHandler.
abhishekrb19 May 20, 2024
6ddcdd5
Clean up unused imports and stuff.
abhishekrb19 May 20, 2024
e91b82c
Test fixes.
abhishekrb19 May 20, 2024
6e563aa
Intellij inspections and test bind.
abhishekrb19 May 20, 2024
b242021
Clean up dependencies some more
abhishekrb19 May 21, 2024
9666512
Extract test load spec and factory to its own class.
abhishekrb19 May 21, 2024
e2feca5
Cleanup test util
abhishekrb19 May 21, 2024
fb018bb
Pull SegmentForTesting out to TestSegmentUtils.
abhishekrb19 May 21, 2024
a51d92a
Fix up.
abhishekrb19 May 21, 2024
e2b6f18
Minor changes to infoDir
abhishekrb19 May 21, 2024
633cc36
Merge branch 'master' into fixup_task_bootstrap_inject
abhishekrb19 May 21, 2024
92a7bac
Replace server announcer mock and verify that.
abhishekrb19 May 21, 2024
ee74c67
Add tests.
abhishekrb19 May 21, 2024
302834d
Merge branch 'master' into fixup_task_bootstrap_inject
abhishekrb19 May 22, 2024
8ad46b9
Update javadocs.
abhishekrb19 May 22, 2024
800c1cb
Address review comments.
abhishekrb19 May 24, 2024
0604c44
Separate methods for download and bootstrap load
abhishekrb19 May 24, 2024
83c2b59
Clean up return types and exception handling.
abhishekrb19 May 24, 2024
7fe4b91
No callback for loadSegment().
abhishekrb19 May 24, 2024
351bd0c
Minor cleanup
abhishekrb19 May 25, 2024
206ca82
Pull out the test helpers into its own static class so it can have be…
abhishekrb19 May 25, 2024
a7de2fa
LocalCacheManager stuff
abhishekrb19 May 25, 2024
c5acbea
Fix build.
abhishekrb19 May 25, 2024
980ffb8
Fix build.
abhishekrb19 May 25, 2024
9c38996
Address some CI warnings.
abhishekrb19 May 25, 2024
3f9862d
Merge branch 'fixup_task_bootstrap_inject' of github.com:abhishekrb19…
abhishekrb19 May 25, 2024
39b9c69
Minor updates to javadocs and test code.
abhishekrb19 May 28, 2024
0efbede
Merge branch 'master' into fixup_task_bootstrap_inject
abhishekrb19 May 28, 2024
1d18c4a
Address some CodeQL test warnings and checkstyle fix.
abhishekrb19 May 28, 2024
615bf09
Pass a Consumer<DataSegment> instead of boolean & rename variables.
abhishekrb19 May 28, 2024
1fd0c1e
Small updates
abhishekrb19 May 28, 2024
29b7815
Remove one test constructor.
abhishekrb19 May 28, 2024
9bf4ace
Remove the other constructor that wasn't initializing fully and updat…
abhishekrb19 May 28, 2024
c6b8cc8
Cleanup withInfoDir() builder and unnecessary test hooks.
abhishekrb19 May 29, 2024
16658e1
Remove mocks and elaborate on comments.
abhishekrb19 May 29, 2024
ae49b07
Merge branch 'master' into fixup_task_bootstrap_inject
abhishekrb19 May 29, 2024
e162311
Commentary
abhishekrb19 May 29, 2024
b6f8f20
Merge branch 'master' into fixup_task_bootstrap_inject
abhishekrb19 May 29, 2024
e02f5cb
Fix a few Intellij inspection warnings.
abhishekrb19 May 29, 2024
98db446
Suppress corePoolSize intellij-inspect warning.
abhishekrb19 May 29, 2024
8024d51
Update docs and add more tests.
abhishekrb19 Jun 3, 2024
da73aa6
Merge branch 'master' into fixup_task_bootstrap_inject
abhishekrb19 Jun 3, 2024
f11da12
Use hamcrest for asserting order on expectation.
abhishekrb19 Jun 3, 2024
b036808
Merge branch 'master' into fixup_task_bootstrap_inject
asdf2014 Jun 4, 2024
c2c9fd1
Shutdown bootstrap exec.
abhishekrb19 Jun 4, 2024
a16d69e
Merge branch 'fixup_task_bootstrap_inject' of github.com:abhishekrb19…
abhishekrb19 Jun 4, 2024
f97d10f
Fix checkstyle
abhishekrb19 Jun 4, 2024
Changes from 3 commits

----------------------------------------
@@ -48,6 +48,7 @@
import org.apache.druid.segment.QueryableIndex;
import org.apache.druid.segment.Segment;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.column.ColumnHolder;
import org.apache.druid.segment.data.Indexed;
import org.apache.druid.segment.data.ListIndexed;
@@ -144,6 +145,7 @@ public void setUp() throws Exception
new SegmentLoaderConfig().withLocations(
ImmutableList.of(new StorageLocationConfig(cacheDir, 10_000_000_000L, null))
),
TestIndex.INDEX_IO,
jsonMapper
);
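
This hunk reflects the central wiring change in the PR: the local cache manager is now handed an IndexIO at construction time instead of resolving it elsewhere. The same three-argument constructor (SegmentLoaderConfig, IndexIO, ObjectMapper) appears explicitly in SegmentCacheManagerFactory.manufacturate() further down. A minimal sketch of that call, not part of the diff; the wrapper class, cache directory, and size limit are illustrative placeholders, and TestHelper.JSON_MAPPER stands in for whatever test mapper the suite uses:

// Sketch only: mirrors the constructor arguments shown in the hunk above.
import com.google.common.collect.ImmutableList;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.loading.SegmentLoaderConfig;
import org.apache.druid.segment.loading.SegmentLocalCacheManager;
import org.apache.druid.segment.loading.StorageLocationConfig;

import java.io.File;

class CacheManagerWiringSketch
{
  static SegmentLocalCacheManager makeCacheManager(File cacheDir)
  {
    return new SegmentLocalCacheManager(
        new SegmentLoaderConfig().withLocations(
            ImmutableList.of(new StorageLocationConfig(cacheDir, 10_000_000_000L, null))
        ),
        TestIndex.INDEX_IO,     // IndexIO is now a required constructor argument
        TestHelper.JSON_MAPPER  // placeholder for the test's JSON ObjectMapper
    );
  }
}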

----------------------------------------
@@ -66,6 +66,7 @@
import org.apache.druid.segment.QueryableIndexStorageAdapter;
import org.apache.druid.segment.Segment;
import org.apache.druid.segment.StorageAdapter;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.column.ColumnConfig;
import org.apache.druid.segment.incremental.IncrementalIndexSchema;
import org.apache.druid.segment.loading.DataSegmentPusher;
@@ -157,7 +158,7 @@ public String getFormatString()
);
ObjectMapper testMapper = MSQTestBase.setupObjectMapper(dummyInjector);
IndexIO indexIO = new IndexIO(testMapper, ColumnConfig.DEFAULT);
SegmentCacheManager segmentCacheManager = new SegmentCacheManagerFactory(testMapper)
SegmentCacheManager segmentCacheManager = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, testMapper)
.manufacturate(cacheManagerDir);
LocalDataSegmentPusherConfig config = new LocalDataSegmentPusherConfig();
MSQTestSegmentManager segmentManager = new MSQTestSegmentManager(segmentCacheManager, indexIO);
----------------------------------------
@@ -139,6 +139,7 @@
import org.apache.druid.segment.QueryableIndexStorageAdapter;
import org.apache.druid.segment.Segment;
import org.apache.druid.segment.StorageAdapter;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.column.ColumnConfig;
import org.apache.druid.segment.column.RowSignature;
import org.apache.druid.segment.incremental.IncrementalIndexSchema;
@@ -423,7 +424,7 @@ public void setUp2() throws Exception
ObjectMapper secondMapper = setupObjectMapper(secondInjector);
indexIO = new IndexIO(secondMapper, ColumnConfig.DEFAULT);

segmentCacheManager = new SegmentCacheManagerFactory(secondMapper).manufacturate(newTempFolder("cacheManager"));
segmentCacheManager = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, secondMapper).manufacturate(newTempFolder("cacheManager"));

MSQSqlModule sqlModule = new MSQSqlModule();

----------------------------------------
@@ -22,6 +22,7 @@
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.inject.Inject;
import org.apache.druid.guice.annotations.Json;
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.loading.SegmentCacheManager;
import org.apache.druid.segment.loading.SegmentLoaderConfig;
import org.apache.druid.segment.loading.SegmentLocalCacheManager;
@@ -35,13 +36,16 @@
*/
public class SegmentCacheManagerFactory
{
private final IndexIO indexIO;
private final ObjectMapper jsonMapper;

@Inject
public SegmentCacheManagerFactory(
IndexIO indexIO,
@Json ObjectMapper mapper
)
{
this.indexIO = indexIO;
this.jsonMapper = mapper;
}

@@ -50,6 +54,7 @@ public SegmentCacheManager manufacturate(File storageDir)
return new SegmentLocalCacheManager(
new SegmentLoaderConfig().withLocations(
Collections.singletonList(new StorageLocationConfig(storageDir, null, null))),
indexIO,
jsonMapper
);
}
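
Because the constructor is @Inject-annotated, Guice supplies the IndexIO (and the @Json ObjectMapper) in production; the test call sites in the hunks below pass TestIndex.INDEX_IO explicitly. A minimal usage sketch of the updated pattern, assuming a temporary cache directory supplied by the test; the wrapper class and method name are illustrative, and the import for SegmentCacheManagerFactory itself is omitted since it is the class defined directly above:

// Sketch only: mirrors the new SegmentCacheManagerFactory(TestIndex.INDEX_IO, mapper)
// call pattern used throughout the test changes in this PR.
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.loading.SegmentCacheManager;

import java.io.File;

class FactoryUsageSketch
{
  static SegmentCacheManager makeCacheManager(File tmpDir)
  {
    final ObjectMapper mapper = TestHelper.JSON_MAPPER;  // placeholder test mapper
    final SegmentCacheManagerFactory factory =
        new SegmentCacheManagerFactory(TestIndex.INDEX_IO, mapper);
    // manufacturate() still returns a cache manager rooted at tmpDir,
    // but now threads the IndexIO through to SegmentLocalCacheManager as well.
    return factory.manufacturate(tmpDir);
  }
}
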
----------------------------------------
@@ -115,6 +115,7 @@
import org.apache.druid.query.timeseries.TimeseriesResultValue;
import org.apache.druid.segment.SegmentSchemaMapping;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.handoff.SegmentHandoffNotifier;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
import org.apache.druid.segment.incremental.RowIngestionMeters;
@@ -1638,7 +1639,7 @@ public void close()
DirectQueryProcessingPool.INSTANCE, // queryExecutorService
NoopJoinableFactory.INSTANCE,
() -> EasyMock.createMock(MonitorScheduler.class),
new SegmentCacheManagerFactory(testUtils.getTestObjectMapper()),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, testUtils.getTestObjectMapper()),
testUtils.getTestObjectMapper(),
testUtils.getTestIndexIO(),
MapCache.create(1024),
----------------------------------------
@@ -55,6 +55,7 @@
import org.apache.druid.query.filter.SelectorDimFilter;
import org.apache.druid.rpc.indexing.OverlordClient;
import org.apache.druid.segment.IndexSpec;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.data.CompressionFactory.LongEncodingStrategy;
import org.apache.druid.segment.data.CompressionStrategy;
import org.apache.druid.segment.incremental.OnheapIncrementalIndex;
@@ -176,7 +177,7 @@ private static ObjectMapper setupInjectablesInObjectMapper(ObjectMapper objectMa
binder.bind(ChatHandlerProvider.class).toInstance(new NoopChatHandlerProvider());
binder.bind(RowIngestionMetersFactory.class).toInstance(ROW_INGESTION_METERS_FACTORY);
binder.bind(CoordinatorClient.class).toInstance(COORDINATOR_CLIENT);
binder.bind(SegmentCacheManagerFactory.class).toInstance(new SegmentCacheManagerFactory(objectMapper));
binder.bind(SegmentCacheManagerFactory.class).toInstance(new SegmentCacheManagerFactory(TestIndex.INDEX_IO, objectMapper));
binder.bind(AppenderatorsManager.class).toInstance(APPENDERATORS_MANAGER);
binder.bind(OverlordClient.class).toInstance(new NoopOverlordClient());
}
@@ -336,7 +337,7 @@ private CompactionTask createCompactionTask(ClientCompactionTaskTransformSpec tr
{
CompactionTask.Builder compactionTaskBuilder = new CompactionTask.Builder(
"datasource",
new SegmentCacheManagerFactory(MAPPER),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, MAPPER),
new RetryPolicyFactory(new RetryPolicyConfig())
)
.inputSpec(new CompactionIntervalSpec(Intervals.of("2019/2020"), "testSha256OfSortedSegmentIds"), true)
----------------------------------------
@@ -79,6 +79,7 @@
import org.apache.druid.segment.DimensionSelector;
import org.apache.druid.segment.IndexSpec;
import org.apache.druid.segment.QueryableIndexStorageAdapter;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.VirtualColumns;
import org.apache.druid.segment.column.ColumnType;
import org.apache.druid.segment.indexing.granularity.UniformGranularitySpec;
@@ -206,7 +207,7 @@ public ListenableFuture<List<DataSegment>> fetchUsedSegments(
);
}
};
segmentCacheManagerFactory = new SegmentCacheManagerFactory(getObjectMapper());
segmentCacheManagerFactory = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, getObjectMapper());
this.lockGranularity = lockGranularity;
}

@@ -2073,6 +2074,7 @@ public List<StorageLocationConfig> getLocations()
return ImmutableList.of(new StorageLocationConfig(localDeepStorage, null, null));
}
},
TestIndex.INDEX_IO,
objectMapper
);

----------------------------------------
@@ -106,6 +106,7 @@
import org.apache.druid.segment.QueryableIndex;
import org.apache.druid.segment.SegmentUtils;
import org.apache.druid.segment.SimpleQueryableIndex;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.column.BaseColumn;
import org.apache.druid.segment.column.ColumnCapabilities;
import org.apache.druid.segment.column.ColumnCapabilitiesImpl;
@@ -301,7 +302,7 @@ private static ObjectMapper setupInjectablesInObjectMapper(ObjectMapper objectMa
binder.bind(RowIngestionMetersFactory.class).toInstance(TEST_UTILS.getRowIngestionMetersFactory());
binder.bind(CoordinatorClient.class).toInstance(COORDINATOR_CLIENT);
binder.bind(SegmentCacheManagerFactory.class)
.toInstance(new SegmentCacheManagerFactory(objectMapper));
.toInstance(new SegmentCacheManagerFactory(TestIndex.INDEX_IO, objectMapper));
binder.bind(AppenderatorsManager.class).toInstance(new TestAppenderatorsManager());
}
)
@@ -391,7 +392,7 @@ public void setup()
SEGMENT_MAP
);
Mockito.when(clock.millis()).thenReturn(0L, 10_000L);
segmentCacheManagerFactory = new SegmentCacheManagerFactory(OBJECT_MAPPER);
segmentCacheManagerFactory = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, OBJECT_MAPPER);
}

@Test
----------------------------------------
@@ -74,6 +74,7 @@
import org.apache.druid.segment.IndexSpec;
import org.apache.druid.segment.QueryableIndexStorageAdapter;
import org.apache.druid.segment.SegmentSchemaMapping;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.VirtualColumns;
import org.apache.druid.segment.column.ColumnType;
import org.apache.druid.segment.column.RowSignature;
@@ -207,6 +208,7 @@ public List<StorageLocationConfig> getLocations()
);
}
},
TestIndex.INDEX_IO,
jsonMapper
);
taskRunner = new TestTaskRunner();
----------------------------------------
@@ -76,6 +76,7 @@
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.IndexMergerV9Factory;
import org.apache.druid.segment.SegmentSchemaMapping;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.incremental.RowIngestionMetersFactory;
import org.apache.druid.segment.join.NoopJoinableFactory;
import org.apache.druid.segment.loading.LocalDataSegmentPusher;
@@ -166,7 +167,7 @@ public void setUpIngestionTestBase() throws IOException
CentralizedDatasourceSchemaConfig.create()
);
lockbox = new TaskLockbox(taskStorage, storageCoordinator);
segmentCacheManagerFactory = new SegmentCacheManagerFactory(getObjectMapper());
segmentCacheManagerFactory = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, getObjectMapper());
reportsFile = temporaryFolder.newFile();
}

----------------------------------------
@@ -101,6 +101,7 @@
import org.apache.druid.query.timeseries.TimeseriesQueryRunnerFactory;
import org.apache.druid.query.timeseries.TimeseriesResultValue;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.handoff.SegmentHandoffNotifier;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
import org.apache.druid.segment.indexing.DataSchema;
@@ -997,7 +998,7 @@ public void close()
DirectQueryProcessingPool.INSTANCE,
NoopJoinableFactory.INSTANCE,
() -> EasyMock.createMock(MonitorScheduler.class),
new SegmentCacheManagerFactory(testUtils.getTestObjectMapper()),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, testUtils.getTestObjectMapper()),
testUtils.getTestObjectMapper(),
testUtils.getTestIndexIO(),
MapCache.create(1024),
----------------------------------------
@@ -52,6 +52,7 @@
import org.apache.druid.segment.DataSegmentsWithSchemas;
import org.apache.druid.segment.Segment;
import org.apache.druid.segment.SegmentLazyLoadFailCallback;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.indexing.DataSchema;
import org.apache.druid.segment.indexing.granularity.GranularitySpec;
import org.apache.druid.segment.indexing.granularity.UniformGranularitySpec;
@@ -305,7 +306,7 @@ List<ScanResultValue> querySegment(DataSegment dataSegment, List<String> columns

private Segment loadSegment(DataSegment dataSegment, File tempSegmentDir)
{
final SegmentCacheManager cacheManager = new SegmentCacheManagerFactory(getObjectMapper())
final SegmentCacheManager cacheManager = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, getObjectMapper())
.manufacturate(tempSegmentDir);
final SegmentLoader loader = new SegmentLocalCacheLoader(cacheManager, getIndexIO(), getObjectMapper());
try {
----------------------------------------
@@ -85,6 +85,7 @@
import org.apache.druid.query.expression.LookupEnabledTestExprMacroTable;
import org.apache.druid.segment.DataSegmentsWithSchemas;
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.incremental.ParseExceptionReport;
import org.apache.druid.segment.incremental.RowIngestionMetersFactory;
import org.apache.druid.segment.incremental.RowIngestionMetersTotals;
@@ -691,7 +692,7 @@ public void prepareObjectMapper(ObjectMapper objectMapper, IndexIO indexIO)
.addValue(AppenderatorsManager.class, TestUtils.APPENDERATORS_MANAGER)
.addValue(LocalDataSegmentPuller.class, new LocalDataSegmentPuller())
.addValue(CoordinatorClient.class, coordinatorClient)
.addValue(SegmentCacheManagerFactory.class, new SegmentCacheManagerFactory(objectMapper))
.addValue(SegmentCacheManagerFactory.class, new SegmentCacheManagerFactory(TestIndex.INDEX_IO, objectMapper))
.addValue(RetryPolicyFactory.class, new RetryPolicyFactory(new RetryPolicyConfig()))
.addValue(TaskConfig.class, taskConfig)
);
----------------------------------------
@@ -45,6 +45,7 @@
import org.apache.druid.query.QueryRunner;
import org.apache.druid.query.scan.ScanResultValue;
import org.apache.druid.query.spec.MultipleIntervalSegmentSpec;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.join.NoopJoinableFactory;
import org.apache.druid.segment.loading.NoopDataSegmentArchiver;
import org.apache.druid.segment.loading.NoopDataSegmentKiller;
@@ -114,7 +115,7 @@ public void setup() throws IOException
null,
NoopJoinableFactory.INSTANCE,
null,
new SegmentCacheManagerFactory(utils.getTestObjectMapper()),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, utils.getTestObjectMapper()),
utils.getTestObjectMapper(),
utils.getTestIndexIO(),
null,
----------------------------------------
@@ -127,6 +127,7 @@
import org.apache.druid.segment.IndexSpec;
import org.apache.druid.segment.SegmentSchemaMapping;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.handoff.SegmentHandoffNotifier;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
import org.apache.druid.segment.indexing.DataSchema;
@@ -646,7 +647,7 @@ public void announceSegment(DataSegment segment)
DirectQueryProcessingPool.INSTANCE, // query executor service
NoopJoinableFactory.INSTANCE,
() -> monitorScheduler, // monitor scheduler
new SegmentCacheManagerFactory(new DefaultObjectMapper()),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, new DefaultObjectMapper()),
MAPPER,
INDEX_IO,
MapCache.create(0),
----------------------------------------
@@ -48,6 +48,7 @@
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.IndexMergerV9Factory;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
import org.apache.druid.segment.incremental.RowIngestionMetersFactory;
import org.apache.druid.segment.join.JoinableFactory;
@@ -140,7 +141,7 @@ public static class Builder
private Provider<MonitorScheduler> monitorSchedulerProvider;
private ObjectMapper jsonMapper = TestHelper.JSON_MAPPER;
private IndexIO indexIO = TestHelper.getTestIndexIO();
private SegmentCacheManagerFactory segmentCacheManagerFactory = new SegmentCacheManagerFactory(jsonMapper);
private SegmentCacheManagerFactory segmentCacheManagerFactory = new SegmentCacheManagerFactory(TestIndex.INDEX_IO, jsonMapper);
private Cache cache;
private CacheConfig cacheConfig;
private CachePopulatorStats cachePopulatorStats;
----------------------------------------
@@ -105,6 +105,7 @@
import org.apache.druid.segment.DimensionHandlerUtils;
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.QueryableIndex;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.column.DictionaryEncodedColumn;
import org.apache.druid.segment.handoff.SegmentHandoffNotifier;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
@@ -684,7 +685,7 @@ public void close()
DirectQueryProcessingPool.INSTANCE,
NoopJoinableFactory.INSTANCE,
() -> EasyMock.createMock(MonitorScheduler.class),
new SegmentCacheManagerFactory(objectMapper),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, objectMapper),
objectMapper,
testUtils.getTestIndexIO(),
MapCache.create(1024),
----------------------------------------
@@ -48,6 +48,7 @@
import org.apache.druid.rpc.indexing.OverlordClient;
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.IndexMergerV9Factory;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
import org.apache.druid.segment.join.NoopJoinableFactory;
import org.apache.druid.segment.metadata.CentralizedDatasourceSchemaConfig;
@@ -145,7 +146,7 @@ private WorkerTaskManager createWorkerTaskManager()
null,
NoopJoinableFactory.INSTANCE,
null,
new SegmentCacheManagerFactory(jsonMapper),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, jsonMapper),
jsonMapper,
indexIO,
null,
----------------------------------------
@@ -51,6 +51,7 @@
import org.apache.druid.rpc.indexing.OverlordClient;
import org.apache.druid.segment.IndexIO;
import org.apache.druid.segment.IndexMergerV9Factory;
import org.apache.druid.segment.TestIndex;
import org.apache.druid.segment.handoff.SegmentHandoffNotifierFactory;
import org.apache.druid.segment.join.NoopJoinableFactory;
import org.apache.druid.segment.metadata.CentralizedDatasourceSchemaConfig;
@@ -187,7 +188,7 @@ private WorkerTaskMonitor createTaskMonitor()
null,
NoopJoinableFactory.INSTANCE,
null,
new SegmentCacheManagerFactory(jsonMapper),
new SegmentCacheManagerFactory(TestIndex.INDEX_IO, jsonMapper),
jsonMapper,
indexIO,
null,