Nodes with inaccessible http.host and connection sniffing #70

@jimmyjones2

Description

I've set up a client node on my Kibana server, bound to loopback for security, but ever since then I've been getting lots of errors from the Perl script that does my data ingest. I added a network interface on the private cluster network to the Kibana server, alongside its normal interface. However, it seems that the Perl client, which connects over the cluster network, picks up this inaccessible loopback address when sniffing, tries to connect to it at some point, and bombs out. The client node's elasticsearch.yml:

node.master: false
node.data: false
network.host: "10.0.0.4"
http.host: "127.0.0.1"
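
For what it's worth, my understanding (happy to be corrected) is that http.host sets both the bind and the publish address for HTTP, so when a client sniffs the cluster this node advertises 127.0.0.1:9200, which nothing off-box can reach. The server-side variant of the workaround mentioned below would be to let HTTP follow network.host instead, roughly:

node.master: false
node.data: false
network.host: "10.0.0.4"
# http.host omitted: HTTP falls back to network.host, so it only listens
# on the cluster interface (not the public one) and sniffing clients get
# a reachable 10.0.0.4:9200, at the cost of indexing traffic passing
# through this box.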

Repro:

#!/usr/bin/perl

use strict;
use warnings;
use Search::Elasticsearch;

# Sniffing pool: the client asks the cluster for its nodes' published
# HTTP addresses and spreads requests across them.
my $es = Search::Elasticsearch->new( cxn_pool => 'Sniff', nodes => 'localhost:9200' );

# A handful of requests is enough for one to land on the unreachable
# loopback address and blow up.
for (1..10) {
  $es->index( index => 'test', type => 'test', body => { foo => 'bar' } );
}

As a workaround I can remove the http.host setting and leave HTTP on the private interface, but I don't really want indexing traffic going through my underpowered web server! Shouldn't the Perl client ignore the inaccessible address rather than bombing out?
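
In the meantime, a client-side workaround seems to be to skip sniffing and list the reachable data nodes explicitly. A minimal sketch, assuming the data nodes sit on the 10.0.0.x cluster network (the two addresses below are made up):

#!/usr/bin/perl

use strict;
use warnings;
use Search::Elasticsearch;

# Static pool: only the listed nodes are ever used, so the loopback-bound
# client node is never discovered or contacted.
my $es = Search::Elasticsearch->new(
    cxn_pool => 'Static',
    nodes    => [ '10.0.0.2:9200', '10.0.0.3:9200' ],
);

$es->index( index => 'test', type => 'test', body => { foo => 'bar' } );

This obviously gives up automatic discovery of new nodes, which is the whole point of sniffing, so a proper fix in the client would still be nice.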
