
Logging to Beats Endpoint (Logstash) and NET::ERR_CERT_INVALID #35

Closed
poorejc opened this issue Mar 10, 2021 · 7 comments
Labels
question Further information is requested

Comments


poorejc commented Mar 10, 2021

Hi!

Great stack! I was able to get your stack running on a Linux VM using docker-machine and VirtualBox (I still like docker-machine), running Docker Engine and Compose at the latest versions through Docker Desktop on Mac. The stack runs and builds well. Two questions:

1. I'm trying to send JSON logs to the Beats input port (5044), but it doesn't look like anything is getting through at http://localhost:5044 or https://localhost:5044. Any quick tips on how to push data to the Beats endpoint? (Longtime ELK user, but I have never used it with security enabled.) Do I need to add my own custom endpoint to the Logstash input configs?

2. Kibana is not in a ready state when I navigate to https://localhost:5601. Likely this is because ES hasn't received any data. However, I'm getting a lot of NET::ERR_CERT_INVALID errors in Chrome and am forced to click through the warnings in Firefox. Any thoughts on the certs?

poorejc added the question (Further information is requested) label Mar 10, 2021

dgknuth commented Mar 15, 2021

I'm having a similar issue. The default build creates all of the certs for Kibana and Logstash assuming localhost. However, unless I map localhost to the server's actual IP, nothing can reach Kibana or any of the other ELK components.

If I do map it through the hosts file, I can access everything, but when I attempt to set up Beats to send data and find the Kibana instance, it fails, telling me that the Kibana cert is not valid for the server; it's only valid for localhost/127.0.0.1.

If I generate certs for Kibana with the proper hostname and plug those into the Kibana config file, then when I start up the stack, connectivity back to Kibana, through either a browser or the Beats config, errors out with connection refused.

For the life of me, I can't sort out how to get proper certs, or how to work around the issue with Beats ingest.
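
A hedged sketch of what regenerating the certs with the server's real name could look like, assuming they come from elasticsearch-certutil driven by an instances file (the hostname, IP, and file name below are placeholders; the exact cert setup in this repo may differ):

# instances.yml (placeholder values), passed to: elasticsearch-certutil cert --in instances.yml
instances:
  - name: kibana
    dns:
      - localhost
      - kibana
      - elk.example.com      # the server's real DNS name
    ip:
      - 127.0.0.1
      - 192.168.1.50         # the server's real IP

A cert generated this way should be valid for both localhost and the real hostname/IP, so Beats and browsers can verify it without hosts-file tricks.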

@sherifabdlnaby (Owner) commented

Sending events through Logstash works fine for me. I did it from a container on the same docker-compose network, using the following output configuration:

output.logstash:
  hosts: ["logstash:5044"]

One thing I noticed in your comment is the use of http://localhost:5044 or https://localhost:5044 ... the connection to the Logstash Beats input is a plain TCP connection speaking the Beats protocol, not HTTP, so it won't answer HTTP requests; pointing a Beats shipper at localhost:5044 (without an http:// or https:// scheme) should work.

For Kibana, it should open even if ES has no data, so something might be corrupted in your setup; please check and share the logs.
As for the certs, Chrome recently stopped trusting self-signed certs on localhost :) Nothing can be done about it unless you use a non-self-signed cert on a web server, or configure your OS to trust this self-signed cert.

@HedgeShot commented

Hi,

I also have trouble sending logs to Logstash. The ELK stack is up and running and I can access Kibana's interface.
When I send logs to Logstash, the logstash container logs show: Invalid version of beats protocol

I did a very simple test by adding a dummy container to the docker-compose file in this repo, so that I am extra sure of being on the same network:

  testlogs:
    image: hello-world
    container_name: testlogs
    restart: "no"
    networks:
      - default
    logging:
      driver: syslog
      options:
        syslog-address: "tcp://127.0.0.1:5044"

Also note that I tried replacing 127.0.0.1 with "logstash" as suggested above, and also with "elastic_logstash_1", but then the dummy container failed to boot: failed to initialize logging driver: dial tcp: no such host

Any suggestions?

@sherifabdlnaby (Owner) commented

I would like to note two points:

  1. If your container can't connect to Logstash via the logstash hostname, then it must have been started on a network other than elastdocker's Docker network (by default named elastic). It is reachable via localhost because the Logstash container publishes the same port on localhost, so please double-check the networking setup.
  2. The endpoint at 5044 is a TCP connection speaking the Beats protocol, not syslog. You cannot use the syslog logging driver to send to Logstash directly via the Beats protocol; for this, you'll need the Syslog input plugin (see the sketch below).
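
A hedged sketch of what a syslog input could look like in the Logstash pipeline config, alongside the existing beats input (the repo's real beats input also carries SSL settings, omitted here; port 5000 is a placeholder and would also need to be published by the logstash service):

input {
  beats {
    port => 5044
  }
  # placeholder syslog input for Docker's syslog logging driver
  syslog {
    port => 5000
  }
}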

@HedgeShot commented

Thanks for your input, but I still can't make it work. The dummy container (testlogs) is indeed on the elastic network:

elastdocker % docker network inspect elastic              
[
    {
        "Name": "elastic",
        "Id": "fd7c3fa8661578c63e637295c729891e0e90e00c317d35f7f6ef2f4358fa089f",
        "Created": "2021-03-23T15:35:04.1849213Z",
        "Scope": "local",
        "Driver": "bridge",
        "EnableIPv6": false,
        "IPAM": {
            "Driver": "default",
            "Options": null,
            "Config": [
                {
                    "Subnet": "172.31.0.0/16",
                    "Gateway": "172.31.0.1"
                }
            ]
        },
        "Internal": false,
        "Attachable": true,
        "Ingress": false,
        "ConfigFrom": {
            "Network": ""
        },
        "ConfigOnly": false,
        "Containers": {
            "36fdb41e1c9b5aba91380c4e51a9551492ee1334460da6945af6b29eec428580": {
                "Name": "elastic_logstash_1",
                "EndpointID": "3a1b0bea9b4c56bd7328de9b4c48d76212ef364b9da38cd8579e633d5162db2d",
                "MacAddress": "02:42:ac:1f:00:02",
                "IPv4Address": "172.31.0.2/16",
                "IPv6Address": ""
            },
            "78bc3e5064051bb016f9ba7d2fdaa04d848f39849933a326534fda84dca95d17": {
                "Name": "elastic_kibana_1",
                "EndpointID": "3caa66907b69f394914c5e71ee18a264628ab38b24d0ac65c821d5d8fd15de9c",
                "MacAddress": "02:42:ac:1f:00:03",
                "IPv4Address": "172.31.0.3/16",
                "IPv6Address": ""
            },
            "c4c5a75f94529c98e2ccf626451b5494d6da919a0ac48a38bf563735a392bdc1": {
                "Name": "testlogs",
                "EndpointID": "201896dd410775993cefd96377f305285ea159ebb55c9577501e2872e996b2f5",
                "MacAddress": "02:42:ac:1f:00:05",
                "IPv4Address": "172.31.0.5/16",
                "IPv6Address": ""
            },
            "eb59fe2d79ac913998a25afe59422e5f641200d57cc25de6f4cb17c94e65aaf5": {
                "Name": "elastic_elasticsearch_1",
                "EndpointID": "33ba6f7918c5598d646514cffd817be23d694ad02dc817cae6cec0aac413c26b",
                "MacAddress": "02:42:ac:1f:00:04",
                "IPv4Address": "172.31.0.4/16",
                "IPv6Address": ""
            }
        },
        "Options": {},
        "Labels": {
            "com.docker.compose.network": "elastic",
            "com.docker.compose.project": "elastic",
            "com.docker.compose.version": "1.28.5"
        }
    }
]

Following your comment, I added the following line to the Dockerfile of logstash:
RUN logstash-plugin install logstash-input-syslog

I added this section to the docker-compose file provided in this repo:

  testlogs:
    image: testlogs
    container_name: testlogs
    restart: "no"
    volumes:
      - ./dummyapp:/app
    depends_on:
      - logstash
    command: "python3 app.py"
    ports:
      - 5003:5000
    logging:
      driver: syslog
      options:
        syslog-address: "tcp://127.0.0.1:5044"

where testlogs is a simple Python web app with a button that prints dummy text to the Docker logs. I get nothing in Logstash (not even an error).
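
A hedged sketch of the wiring that would still be needed on top of installing the plugin: the pipeline needs a syslog input (as sketched earlier), the logstash service has to publish that port, and the driver has to target it instead of the Beats port 5044 (5000 is still a placeholder):

  logstash:
    ports:
      - "5044:5044"
      - "5000:5000"          # placeholder port for the syslog input

  testlogs:
    logging:
      driver: syslog
      options:
        syslog-address: "tcp://127.0.0.1:5000"   # target the published syslog port, not the Beats port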

@HedgeShot commented

Update:
I could make it work by using the gelf driver instead. The only annoying part is that I need to set 127.0.0.1 as the IP address of Logstash; if I set logstash or localhost, it fails or nothing shows up in Kibana.
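
For reference, a hedged sketch of such a gelf setup (12201/udp is the conventional GELF port; the pipeline needs a matching gelf input, and the exact ports here are placeholders):

  logstash:
    ports:
      - "12201:12201/udp"    # published GELF port, matching a gelf { port => 12201 } input in the pipeline

  testlogs:
    logging:
      driver: gelf
      options:
        gelf-address: "udp://127.0.0.1:12201"   # the logging driver connects from the Docker host, hence 127.0.0.1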

@sherifabdlnaby (Owner) commented

Aha, sorry, I missed this point: you're sending the logs via Docker's logging driver. The driver itself is not part of the container's network; it is a construct of the host. That's why the driver (being on the host) can't resolve the logstash hostname, which only exists on the container network.
