-log-file option not respecting log-level #4778

Closed
uyoahz opened this issue Oct 10, 2018 · 20 comments

uyoahz commented Oct 10, 2018

After I start up the Consul server (v1.2.3) with the log level set to "INFO" or "WARN", I can still see many DEBUG-level log lines in the output. Can anyone help check this?

This is my config.
bash-4.2$ cat consul-config.json
{
"datacenter": "ssc",
"data_dir": "/opt/data/ssc/consul",
"log_level": "WARN",
"disable_host_node_id" : true,
"server": true,
"client_addr": "0.0.0.0",
"ui": true,
"bootstrap_expect": 3
}

And these are the logs.
2018/10/10 21:53:54 [WARN] agent: Node name "a.b.c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
2018/10/10 21:53:54 [INFO] raft: Initial configuration (index=2144): [{Suffrage:Voter ID:5d21d4a7-9604-ee22-4aea-7b97418349b3 Address:10.113.6.144:8300} {Suffrage:Voter ID:15b662fb-2d1e-1efa-60fa-43af9773186d Address:10.113.7.17:8300} {Suffrage:Voter ID:f7c36e86-9e84-8117-8462-0a7356475bf5 Address:10.113.6.108:8300}]
2018/10/10 21:53:54 [INFO] raft: Node at 10.113.6.108:8300 [Follower] entering Follower state (Leader: "")
2018/10/10 21:53:54 [INFO] serf: EventMemberJoin: a.b.c.ssc 10.113.6.108
2018/10/10 21:53:54 [INFO] serf: Attempting re-join to previously known node: a.b.dssc: 10.113.7.17:8302
2018/10/10 21:53:54 [DEBUG] memberlist: Initiating push/pull sync with: 10.113.7.17:8302
2018/10/10 21:53:54 [INFO] serf: EventMemberJoin: a.b.c 10.113.6.108
2018/10/10 21:53:54 [INFO] serf: Attempting re-join to previously known node: a.b.d.ssc: 10.113.6.144:8301
2018/10/10 21:53:54 [INFO] consul: Adding LAN server a.b.c (Addr: tcp/10.113.6.108:8300) (DC: ssc)
2018/10/10 21:53:54 [INFO] consul: Handled member-join event for server "a.b.c.ssc" in area "wan"
2018/10/10 21:53:54 [INFO] consul: Raft data found, disabling bootstrap mode
2018/10/10 21:53:54 [DEBUG] agent/proxy: managed Connect proxy manager started
2018/10/10 21:53:54 [DEBUG] memberlist: Initiating push/pull sync with: 10.113.6.144:8301
2018/10/10 21:54:09 [DEBUG] agent: Node info in sync
2018/10/10 21:54:11 [DEBUG] memberlist: Stream connection from=10.113.7.17:58602
2018/10/10 21:54:26 [DEBUG] memberlist: Stream connection from=10.113.6.144:33598
2018/10/10 21:54:40 [DEBUG] memberlist: Initiating push/pull sync with: 10.113.7.17:8301
2018/10/10 21:54:41 [DEBUG] memberlist: Stream connection from=10.113.7.17:58606
2018/10/10 21:54:56 [DEBUG] memberlist: Stream connection from=10.113.6.144:33602
2018/10/10 21:55:04 [DEBUG] memberlist: Stream connection from=10.113.6.144:34674
2018/10/10 21:55:09 [DEBUG] memberlist: Stream connection from=10.113.7.17:53974
2018/10/10 21:55:10 [DEBUG] memberlist: Initiating push/pull sync with: 10.113.6.144:8301
2018/10/10 21:55:16 [DEBUG] memberlist: Initiating push/pull sync with: 10.113.7.17:8302
2018/10/10 21:55:29 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
2018/10/10 21:55:29 [DEBUG] agent: Node info in sync
2018/10/10 21:55:40 [DEBUG] memberlist: Initiating push/pull sync with: 10.113.6.144:8301
2018/10/10 21:55:54 [DEBUG] manager: Rebalanced 3 servers, next active server is a.b.d.ssc (Addr: tcp/10.113.6.144:8300)

Output from 'consul info'

bash-4.2$ ./consul info
agent:
        check_monitors = 0
        check_ttls = 0
        checks = 0
        services = 0
build:
        prerelease =
        revision = 48d287ef
        version = 1.2.3
consul:
        bootstrap = false
        known_datacenters = 1
        leader = false
        leader_addr = 10.113.7.17:8300
        server = true
raft:
        applied_index = 2421
        commit_index = 2421
        fsm_pending = 0
        last_contact = 19.298597ms
        last_log_index = 2421
        last_log_term = 26
        last_snapshot_index = 0
        last_snapshot_term = 0
        latest_configuration = [{Suffrage:Voter ID:5d21d4a7-9604-ee22-4aea-7b97418349b3 Address:10.113.6.144:8300} {Suffrage:Voter ID:15b662fb-2d1e-1efa-60fa-43af9773186d Address:10.113.7.17:8300} {Suffrage:Voter ID:f7c36e86-9e84-8117-8462-0a7356475bf5 Address:10.113.6.108:8300}]
        latest_configuration_index = 2144
        num_peers = 2
        protocol_version = 3
        protocol_version_max = 3
        protocol_version_min = 0
        snapshot_version_max = 1
        snapshot_version_min = 0
        state = Follower
        term = 26
runtime:
        arch = amd64
        cpu_count = 16
        goroutines = 78
        max_procs = 16
        os = linux
        version = go1.10.1
serf_lan:
        coordinate_resets = 0
        encrypted = false
        event_queue = 0
        event_time = 6
        failed = 0
        health_score = 0
        intent_queue = 0
        left = 0
        member_time = 14
        members = 3
        query_queue = 0
        query_time = 1
serf_wan:
        coordinate_resets = 0
        encrypted = false
        event_queue = 0
        event_time = 1
        failed = 0
        health_score = 0
        intent_queue = 0
        left = 0
        member_time = 13
        members = 3
        query_queue = 0
        query_time = 1

CJLove commented Oct 12, 2018

Seeing this as well on a couple of agents after moving to v1.2.3 in order to log to a file rather than syslog.


pearkes commented Oct 26, 2018

I'm unable to reproduce this supplying the argument via files or the CLI. As a result I'm going to close this but please feel free to comment back if you can consistently reproduce.

@pearkes pearkes closed this as completed Oct 26, 2018
@NelsonFigueira

Hi, we are experiencing the same issue.
Consul 1.2.3
Running on Linux 6.10

@trondhindenes

I can repro. log_level is set in the config file, and -log-file is specified on the command line (since it apparently has no config-file equivalent).

@trondhindenes

Also reproducible if log-level is specified on the command line, e.g. -log-file=/var/log/consul/ -log-level=info.


pearkes commented Nov 21, 2018

@trondhindenes Can you post the complete commands you're running and log output, and any configuration?

@pearkes pearkes reopened this Nov 21, 2018

trondhindenes commented Nov 21, 2018

ubuntu@ldm-0104056:/var/log/consul$ cat /opt/consul/config/consul_client_config.json 
{
  "datacenter": "aws-euw1",
  "data_dir": "/opt/consul/consul-1.4.0/data",
  "server": false,
  "node_name": "ldm-0104056",
  "addresses": {
    "https": "127.0.0.1"
  },
  "ports": {
  "http": 8500,
  "https": 9999
  },
  "acl_datacenter": "aws-euw1",
  "key_file": "/opt/consul/consul-1.4.0/cert/ldm-0104056.aws-euw1.sd.my.domain.ding.key-decrypted.pem",
  "cert_file": "/opt/consul/consul-1.4.0/cert/ldm-0104056.aws-euw1.sd.my.domain.ding.cert.pem",
  "ca_file": "/opt/consul/consul-1.4.0/cert/root.ca.pem",
  "verify_incoming": true, 
  "verify_outgoing": true, 
  "verify_server_hostname": true,
  "domain":"sd.my.domain.ding",
  "ui":true,
  "start_join": ["10.245.14.251","10.245.14.84","10.245.14.199"],
  "telemetry": {
     "dogstatsd_addr": "127.0.0.1:8125"
  }
}

ubuntu@ldm-0104056:/var/log/consul$ cat /etc/systemd/system/consul.service 
[Unit]
Description=consul
Requires=network-online.target
After=network-online.target

[Service]
EnvironmentFile=-/etc/sysconfig/consul
Environment=GOMAXPROCS=2
Restart=on-failure
ExecStart=/usr/local/bin/consul agent -config-dir=/opt/consul/config -data-dir=/opt/consul/consul-1.4.0/data -log-file=/var/log/consul/ -log-level=info
ExecReload=/bin/kill -HUP $MAINPID
KillSignal=SIGINT

[Install]
WantedBy=multi-user.target

Snippet from log:

2018/11/21 17:14:12 [DEBUG] memberlist: Stream connection from=10.245.12.134:47528
2018/11/21 17:14:16 [DEBUG] memberlist: Stream connection from=10.245.14.59:56712
2018/11/21 17:14:19 [DEBUG] memberlist: Stream connection from=10.245.12.229:58922
2018/11/21 17:14:20 [DEBUG] memberlist: Initiating push/pull sync with: 10.245.12.52:8301
2018/11/21 17:14:21 [DEBUG] http: Request GET /v1/catalog/services?index=5513212&wait=15000ms (15.419545824s) from=127.0.0.1:48782
2018/11/21 17:14:22 [DEBUG] http: Request GET /v1/health/state/any?index=5513383&wait=15000ms (15.585374044s) from=127.0.0.1:48780
2018/11/21 17:14:24 [DEBUG] memberlist: Failed ping: utils-consul-ui-prod-v6rhs (timeout reached)
2018/11/21 17:14:31 [DEBUG] http: Request GET /v1/health/state/any?index=5513383&wait=15000ms (9.46269355s) from=127.0.0.1:48780
2018/11/21 17:14:31 [DEBUG] http: Request GET /v1/catalog/services (43.810135ms) from=127.0.0.1:48780

@trondhindenes

(I tried -log-level on the command line and log_level in the config, but neither seems to have any effect.)


pearkes commented Nov 21, 2018

@trondhindenes Can you post the log output in question as well?

@trondhindenes

Sure, updated. Interestingly, the journalctl output from Consul seems to respect log-level, but the log file does not.


pearkes commented Nov 21, 2018

@trondhindenes Ah nice catch, that is the bug. stdout is behaving as expected but the log-file option added in #4581 is not.
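
To illustrate the shape of this, here is a minimal Go sketch assuming a hashicorp/logutils-style level filter. This is not Consul's actual wiring, and the file path and variable names are made up for the example: if the file sink is attached to the raw log stream instead of sitting behind the level filter, the filter only gates stdout and every line, including DEBUG, still lands in the file.

package main

import (
	"io"
	"log"
	"os"

	"github.com/hashicorp/logutils"
)

func main() {
	// Hypothetical file sink; the path is illustrative only.
	f, err := os.OpenFile("/var/log/consul/consul.log",
		os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0644)
	if err != nil {
		log.Fatal(err)
	}

	// The filter suppresses lines below MinLevel, but only for writes
	// that actually pass through it.
	filter := &logutils.LevelFilter{
		Levels:   []logutils.LogLevel{"TRACE", "DEBUG", "INFO", "WARN", "ERR"},
		MinLevel: logutils.LogLevel("INFO"),
		Writer:   os.Stdout, // only stdout sits behind the filter
	}

	// Buggy shape: fan out to the file *before* filtering, so the file
	// receives the raw stream and DEBUG lines still land in it.
	log.SetOutput(io.MultiWriter(filter, f))
	log.Println("[DEBUG] this reaches the file but not stdout")

	// Expected shape: put the fan-out *behind* the filter so both sinks
	// honor the configured level.
	filter.Writer = io.MultiWriter(os.Stdout, f)
	log.SetOutput(filter)
	log.Println("[DEBUG] now suppressed everywhere")
}

Under this sketch, routing the file writer through the same filter that already gates stdout is what would make -log-file honor -log-level.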

@pearkes pearkes added the type/bug (Feature does not function as expected) label Nov 21, 2018
@pearkes pearkes changed the title from Always see debug level logs even with log-level="WARN" to -log-file option not respecting log-level Nov 21, 2018

pearkes commented Nov 21, 2018

Renamed the issue for clarity. Thanks again @trondhindenes!

@trondhindenes

awesome!

@chsk-code

Hi, we are experiencing the same issue.
Consul v1.4.0
Operating System: Red Hat Enterprise Linux Server 7.6 (Maipo)
command used:
/usr/local/sbin/consul agent -config-file=/etc/consul/config.json -config-dir=/etc/consul/consul.d -pid-file=/var/run/consul/consul.pid -log-file=/var/log/consul/consul_agent -log-level=INFO

Part of the log file:
2018/12/19 03:05:45 [DEBUG] agent: Check "vault:node-01:8001:vault-sealed-check" status is now critical
2018/12/19 03:05:45 [DEBUG] agent: Service "vault:node-01:8001" in sync
2018/12/19 03:05:45 [DEBUG] agent: Check "vault:node-01:8001:vault-sealed-check" in sync
2018/12/19 03:05:45 [DEBUG] agent: Node info in sync
2018/12/19 03:05:45 [DEBUG] http: Request PUT /v1/agent/check/fail/vault:node-01:8001:vault-sealed-check?note=Vault+Sealed (283.213µs) from=10.117.19.101:38503

@aryosigit

@pearkes Is there any estimate of when this bug will be fixed? I am working on a project that needs the log-level setting applied to the log file so the volume of log entries is reduced.


varnson commented Feb 14, 2019

@pearkes When is this expected to be fixed?

@trondhindenes

I'd like to know that as well.

@pearkes pearkes added the good first issue (A well-defined bug or improvement with sufficient context which should be approachable for new contributors) label Feb 25, 2019
@pearkes pearkes added this to the Upcoming milestone Feb 25, 2019

pearkes commented Feb 25, 2019

We haven't scheduled this work yet; I just tagged it to increase visibility and put it on our backlog to fix.

@dsouzajude

@pearkes I opened a PR for this: #5510. I'd appreciate it if someone could review it.

@freddygv

Closing this issue since it was fixed in #5510. Will be in the upcoming 1.5.0 release.
