
Error 'File recursively imports itself' with Kafka Engine and ProtobufSingle format #45875

@rgushel

Description


You have to provide the following information whenever possible.

Table config:

CREATE TABLE raw_stats_queue
(
    timestamp   DateTime,
    deviceModel String,
    userId      String,
    impressions UInt64
) ENGINE = Kafka()
      SETTINGS
          kafka_broker_list = 'kafka:9092',
          kafka_topic_list = 'impression',
          kafka_group_name = 'clickhouse-stats',
          kafka_format = 'ProtobufSingle',
          kafka_schema = 'stats.proto:Event';
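
The trace below ends in StorageKafka::streamToViews, so a materialized view was presumably attached to the Kafka table. A typical setup (a sketch with assumed table names, not taken from the issue) looks like:

CREATE TABLE raw_stats
(
    timestamp   DateTime,
    deviceModel String,
    userId      String,
    impressions UInt64
) ENGINE = MergeTree
ORDER BY timestamp;

-- The materialized view pulls rows from the Kafka table into storage
CREATE MATERIALIZED VIEW raw_stats_mv TO raw_stats AS
SELECT * FROM raw_stats_queue;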

Proto file:

syntax = "proto3";

import "google/protobuf/timestamp.proto";

option java_package = "io.sponsorcart.server.model.generated";
option java_outer_classname = "StatsProtos";

message Event {
  google.protobuf.Timestamp timestamp = 1; // <-- without this field and the import above, parsing works fine
  string deviceModel = 3;
  string userId = 4;
  int64 impressions = 5;
}
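
Workaround (an untested sketch, not confirmed in this issue): the error only appears when the schema imports another file, so one way to sidestep it is to drop the import and carry the timestamp as a plain scalar, which ClickHouse can map to the DateTime column directly:

syntax = "proto3";

message Event {
  uint64 timestamp = 1; // Unix seconds instead of google.protobuf.Timestamp
  string deviceModel = 3;
  string userId = 4;
  int64 impressions = 5;
}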

Trace:

2023.01.31 23:15:45.753752 [ 252 ] {} <Error> void DB::StorageKafka::threadFunc(size_t): Code: 434. DB::Exception: Cannot parse 'stats.proto' file, found an error at line -1, column 0, File recursively imports itself: stats.proto -> stats.proto: While executing Kafka. (CANNOT_PARSE_PROTOBUF_SCHEMA), Stack trace (when copying this message, always include the lines below):

0. DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0xbb157c4 in /usr/bin/clickhouse
1. ? @ 0x11551af4 in /usr/bin/clickhouse
2. google::protobuf::compiler::SourceTreeDescriptorDatabase::ValidationErrorCollector::AddError(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, google::protobuf::Message const*, google::protobuf::DescriptorPool::ErrorCollector::ErrorLocation, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) @ 0x13672488 in /usr/bin/clickhouse
3. google::protobuf::DescriptorBuilder::AddError(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, google::protobuf::Message const&, google::protobuf::DescriptorPool::ErrorCollector::ErrorLocation, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) @ 0x1369f90c in /usr/bin/clickhouse
4. google::protobuf::DescriptorBuilder::AddRecursiveImportError(google::protobuf::FileDescriptorProto const&, int) @ 0x136a44c4 in /usr/bin/clickhouse
5. google::protobuf::DescriptorBuilder::BuildFile(google::protobuf::FileDescriptorProto const&) @ 0x1369f090 in /usr/bin/clickhouse
6. google::protobuf::DescriptorPool::BuildFileFromDatabase(google::protobuf::FileDescriptorProto const&) const @ 0x1369314c in /usr/bin/clickhouse
7. google::protobuf::DescriptorPool::TryFindFileInFallbackDatabase(google::protobuf::stringpiece_internal::StringPiece) const @ 0x1368ff5c in /usr/bin/clickhouse
8. google::protobuf::DescriptorPool::FindFileByName(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) const @ 0x1368fd2c in /usr/bin/clickhouse
9. ? @ 0x115512a0 in /usr/bin/clickhouse
10. DB::ProtobufSchemas::getMessageTypeForFormatSchema(DB::FormatSchemaInfo const&, DB::ProtobufSchemas::WithEnvelope) @ 0x11551154 in /usr/bin/clickhouse
11. DB::ProtobufRowInputFormat::ProtobufRowInputFormat(DB::ReadBuffer&, DB::Block const&, DB::RowInputFormatParams const&, DB::FormatSchemaInfo const&, bool, bool) @ 0x11553218 in /usr/bin/clickhouse
12. ? @ 0x115541bc in /usr/bin/clickhouse
13. DB::FormatFactory::getInputFormat(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, DB::ReadBuffer&, DB::Block const&, std::__1::shared_ptr<DB::Context const>, unsigned long, std::__1::optional<DB::FormatSettings> const&) const @ 0x113bc40c in /usr/bin/clickhouse
14. DB::KafkaSource::generateImpl() @ 0x10d6a7cc in /usr/bin/clickhouse
15. DB::KafkaSource::generate() @ 0x10d6d598 in /usr/bin/clickhouse
16. DB::ISource::tryGenerate() @ 0x113d8c0c in /usr/bin/clickhouse
17. DB::ISource::work() @ 0x113d8710 in /usr/bin/clickhouse
18. DB::ExecutionThreadContext::executeTask() @ 0x113ef280 in /usr/bin/clickhouse
19. DB::PipelineExecutor::executeStepImpl(unsigned long, std::__1::atomic<bool>*) @ 0x113e51ec in /usr/bin/clickhouse
20. DB::PipelineExecutor::executeImpl(unsigned long) @ 0x113e4474 in /usr/bin/clickhouse
21. DB::PipelineExecutor::execute(unsigned long) @ 0x113e416c in /usr/bin/clickhouse
22. DB::CompletedPipelineExecutor::execute() @ 0x113e2ad4 in /usr/bin/clickhouse
23. DB::StorageKafka::streamToViews() @ 0x10d5d008 in /usr/bin/clickhouse
24. DB::StorageKafka::threadFunc(unsigned long) @ 0x10d5abb0 in /usr/bin/clickhouse
25. DB::BackgroundSchedulePoolTaskInfo::execute() @ 0xf768a5c in /usr/bin/clickhouse
26. DB::BackgroundSchedulePool::threadFunction() @ 0xf76ba48 in /usr/bin/clickhouse
27. ? @ 0xf76c8b8 in /usr/bin/clickhouse
28. ThreadPoolImpl<std::__1::thread>::worker(std::__1::__list_iterator<std::__1::thread, void*>) @ 0xbbccca8 in /usr/bin/clickhouse
29. ? @ 0xbbd1868 in /usr/bin/clickhouse
30. start_thread @ 0x7624 in /lib/libpthread.so.0
31. ? @ 0xd149c in /lib/libc.so.6
 (version 23.1.2.9 (official build))

A clear and concise description of what works not as it is supposed to.

ClickHouse data ingestion from a Kafka topic fails for proto files that import other schemas (here, google/protobuf/timestamp.proto): the schema parser incorrectly reports "File recursively imports itself" even though the import is not recursive.

Add any other context about the problem here.

Docker clickhouse/clickhouse-server:23.1.2.9-alpine

Metadata


Labels

comp-kafka (Kafka Engine), potential bug (to be reviewed by developers and confirmed/rejected)
