
Errno::ENOENT: No such file or directory #41

Closed
jdboix opened this issue Aug 5, 2019 · 2 comments
jdboix commented Aug 5, 2019

Hi all,

I upgraded my Logstash image to 7.2.0, and when the pipeline restarted it began throwing an Errno::ENOENT: No such file or directory error. The plugin appears to be trying to open and upload a temporary log file that no longer exists.

Plugin version: 4.0.1

Output Section:

output {
  if [type] == "log" {
    google_cloud_storage { 
      bucket => "my-bucket"
      log_file_prefix => "logs/%{+YYYY/MM/dd}"
      codec => "json_lines"
      gzip => true
      max_file_size_kbytes => 5242
    }
  }

  if [type] == "raw-log" {
    google_cloud_storage { 
      bucket => "my-bucket"
      log_file_prefix => "logs/raw/%{+YYYY/MM/dd}/"
      codec => "json_lines"
      gzip => true
      max_file_size_kbytes => 5242
    }
  }
}

Error trace:

Pipeline aborted due to error 
{
  :pipeline_id=>"main", 
  :exception=>
    #<Errno::ENOENT: No such file or directory - 
    /tmp/logstash-gcs-504f0f1052346b639f01477069a72f1f76be227ff8e41870e5f7859b5be1/logs/%{+YYYY/MM/dd}_logstasher-785c699966-sg4c9_2019-08-05T19:00.part001.log.gz>, 
  :backtrace=>[
    "org/jruby/RubyIO.java:1236:in `sysopen'", 
    "org/jruby/RubyFile.java:367:in `initialize'", 
    "org/jruby/RubyIO.java:876:in `new'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/gcs/temp_log_file.rb:29:in `initialize'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/gcs/temp_log_file.rb:14:in `create'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/gcs/log_rotate.rb:51:in `block in rotate_log!'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/concurrent-ruby-1.1.5/lib/concurrent/atomic/reentrant_read_write_lock.rb:147:in `with_write_lock'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/gcs/log_rotate.rb:42:in `rotate_log!'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/gcs/log_rotate.rb:19:in `initialize'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/google_cloud_storage.rb:266:in `initialize_log_rotater'", 
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-google_cloud_storage-4.0.1-java/lib/logstash/outputs/google_cloud_storage.rb:164:in `register'", 
    "org/logstash/config/ir/compiler/OutputStrategyExt.java:106:in `register'", 
    "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:48:in `register'", 
    "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:192:in `block in register_plugins'", 
    "org/jruby/RubyArray.java:1792:in `each'", 
    "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:191:in `register_plugins'", 
    "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:462:in `maybe_setup_out_plugins'", 
    "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:204:in `start_workers'", 
    "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:146:in `run'", 
    "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:105:in `block in start'"
  ], 
  :thread=>"#<Thread:0x7f6c858e run>"
}
@colinsurprenant (Contributor)

Hi @jdboix - thanks for the report.
The problem here is that the log_file_prefix option does not support string interpolation, so %{+YYYY/MM/dd} is not transformed, as you can see in this error log:

 #<Errno::ENOENT: No such file or directory - /tmp/logstash-gcs-504f0f1052346b639f01477069a72f1f76be227ff8e41870e5f7859b5be1/logs/%{+YYYY/MM/dd}_logstasher-785c699966-sg4c9_2019-08-05T19:00.part001.log.gz>, 

What you probably want to use are the temp_directory and date_pattern options. You can find more information on the supported options at https://www.elastic.co/guide/en/logstash/7.3/plugins-outputs-google_cloud_storage.html.
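For example, one of your output blocks could be rewritten along these lines (a sketch only; the prefix, bucket name, and strftime pattern are illustrative and should be adjusted to your setup):

```
output {
  if [type] == "log" {
    google_cloud_storage {
      bucket => "my-bucket"
      log_file_prefix => "logs/app"          # taken literally; %{...} is NOT interpolated
      date_pattern => "%Y-%m-%d_%H"          # strftime pattern used in generated file names
      temp_directory => "/tmp/logstash-gcs"  # local staging directory for .part files
      codec => "json_lines"
      gzip => true
      max_file_size_kbytes => 5242
    }
  }
}
```

With this, the date ends up in the generated file name via date_pattern rather than in the prefix, so the local temp file path contains no unexpanded %{+YYYY/MM/dd} segment.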

One improvement here is that we could provide a better exception message. I created #44 to track this issue.

@colinsurprenant (Contributor)

Oh @jdboix - I forgot to mention that the codec option is only supported since v4.1.0 of the plugin; with earlier versions you need to use the output_format option instead.
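So on 4.0.1 the block would look something like this (a sketch; "json" is assumed here to be the output_format value matching json_lines-style output, per the plugin docs):

```
google_cloud_storage {
  bucket => "my-bucket"
  output_format => "json"   # on plugin versions < 4.1.0, instead of codec => "json_lines"
  gzip => true
  max_file_size_kbytes => 5242
}
```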
