When doing an /api/query request for which one of the tags' values (tagv) does not exist, the big exception stack trace below appears in the log. It really should not, since this is an ordinary user-input issue (rather than a problem in the OpenTSDB code). The WARN for the 'Bad Request' should be sufficient; the stack trace just creates undesirable log noise.
(Note that /api/query/last with a nonexistent tagv does not produce a stack trace in the logs, i.e. it does handle this case gracefully.)
2021-10-20 02:47:09,927 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
com.stumbleupon.async.DeferredGroupException: At least one of the Deferreds failed, first exception:
at com.stumbleupon.async.DeferredGroup.done(DeferredGroup.java:169) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.recordCompletion(DeferredGroup.java:142) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.access$000(DeferredGroup.java:36) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup$1Notify.call(DeferredGroup.java:82) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.doCall(Deferred.java:1278) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.runCallbacks(Deferred.java:1257) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.access$300(Deferred.java:430) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred$Continue.call(Deferred.java:1366) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.doCall(Deferred.java:1278) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.runCallbacks(Deferred.java:1257) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.callback(Deferred.java:1005) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.done(DeferredGroup.java:169) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.recordCompletion(DeferredGroup.java:142) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.access$000(DeferredGroup.java:36) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup$1Notify.call(DeferredGroup.java:82) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.doCall(Deferred.java:1278) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.runCallbacks(Deferred.java:1257) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.callback(Deferred.java:1005) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.done(DeferredGroup.java:169) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.recordCompletion(DeferredGroup.java:142) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup.access$000(DeferredGroup.java:36) [async-1.4.0.jar:na]
at com.stumbleupon.async.DeferredGroup$1Notify.call(DeferredGroup.java:82) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.doCall(Deferred.java:1278) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.runCallbacks(Deferred.java:1257) [async-1.4.0.jar:na]
at com.stumbleupon.async.Deferred.callback(Deferred.java:1005) [async-1.4.0.jar:na]
at org.hbase.async.HBaseRpc.callback(HBaseRpc.java:720) [asynchbase-1.8.2.jar:na]
at org.hbase.async.RegionClient.decode(RegionClient.java:1575) [asynchbase-1.8.2.jar:na]
at org.hbase.async.RegionClient.decode(RegionClient.java:88) [asynchbase-1.8.2.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:500) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.10.6.Final.jar:na]
at org.hbase.async.RegionClient.handleUpstream(RegionClient.java:1230) [asynchbase-1.8.2.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.messageReceived(SimpleChannelHandler.java:142) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:88) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.handler.timeout.IdleStateAwareChannelHandler.handleUpstream(IdleStateAwareChannelHandler.java:36) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.handler.timeout.IdleStateHandler.messageReceived(IdleStateHandler.java:294) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) [netty-3.10.6.Final.jar:na]
at org.hbase.async.HBaseClient$RegionClientPipeline.sendUpstream(HBaseClient.java:3857) [asynchbase-1.8.2.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [netty-3.10.6.Final.jar:na]
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [netty-3.10.6.Final.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_292]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_292]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_292]
Caused by: com.stumbleupon.async.DeferredGroupException: At least one of the Deferreds failed, first exception:
... 45 common frames omitted
Caused by: com.stumbleupon.async.DeferredGroupException: At least one of the Deferreds failed, first exception:
... 38 common frames omitted
Caused by: net.opentsdb.uid.NoSuchUniqueName: No such name for 'tagv': 'redacted'
at net.opentsdb.uid.UniqueId$1GetIdCB.call(UniqueId.java:450) ~[tsdb-2.4.1.jar:0ad603b]
at net.opentsdb.uid.UniqueId$1GetIdCB.call(UniqueId.java:447) ~[tsdb-2.4.1.jar:0ad603b]
... 34 common frames omitted
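One way QueryRpc could avoid this noise (a sketch, not the actual OpenTSDB code) is to walk the cause chain of the exception and downgrade the logging when the root cause is a user-input error such as NoSuchUniqueName. As the trace above shows, the interesting cause can be buried several DeferredGroupException levels deep. A minimal, self-contained illustration of the unwrapping logic, using a stand-in exception class in place of net.opentsdb.uid.NoSuchUniqueName:

```java
// Stand-in for net.opentsdb.uid.NoSuchUniqueName (hypothetical, for illustration only).
class NoSuchUniqueName extends RuntimeException {
  NoSuchUniqueName(final String msg) { super(msg); }
}

public class UnwrapDemo {
  /** Walks the cause chain looking for a user-input error. */
  static Throwable findUserError(final Throwable t) {
    for (Throwable cur = t; cur != null; cur = cur.getCause()) {
      if (cur instanceof NoSuchUniqueName) {
        return cur;
      }
    }
    return null;
  }

  public static void main(final String[] args) {
    // Simulate the nested DeferredGroupException chain from the log above.
    final Throwable root = new NoSuchUniqueName("No such name for 'tagv': 'redacted'");
    final Throwable mid = new RuntimeException("At least one of the Deferreds failed", root);
    final Throwable top = new RuntimeException("At least one of the Deferreds failed", mid);

    final Throwable userError = findUserError(top);
    if (userError != null) {
      // A one-line WARN instead of an ERROR with the full stack trace.
      System.out.println("WARN Bad Request: " + userError.getMessage());
    } else {
      // Unexpected failure: keep the full stack trace.
      top.printStackTrace();
    }
  }
}
```

The actual fix would live in QueryRpc's errback, where it already has the HttpQuery to answer with a 400; the point is only that the ERROR-with-stack-trace branch can check the cause chain first.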
Here is a log grep showing the correspondence between these ERROR entries and the WARN lines for the associated queries:
2021-10-20 02:47:09,927 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:09,930 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x5f0f71ba, /192.168.2.6:35720 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:09,984 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:09,985 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0xb9353ad7, /192.168.2.6:35726 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:10,064 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:10,065 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x57589b0e, /192.168.2.6:35730 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:10,123 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:10,124 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x82c1becc, /192.168.2.6:35734 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:10,159 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:10,159 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x495e32d7, /192.168.2.6:35740 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:30m-max:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:10,202 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:10,205 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0xaa5f9542, /192.168.2.6:35748 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:30m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:10,231 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:10,232 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x24f5bdd0, /192.168.2.6:35752 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:30m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:10,296 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:10,297 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0xfa57e08d, /192.168.2.6:35756 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:13,803 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:13,804 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x6174864c, /192.168.2.6:35872 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:13,854 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:13,854 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x3e4b1e20, /192.168.2.6:35876 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:13,878 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:13,878 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x66e94cb1, /192.168.2.6:35880 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'
2021-10-20 02:47:13,901 [AsyncHBase I/O Worker #4] ERROR [QueryRpc.call] - Query exception:
2021-10-20 02:47:13,901 [AsyncHBase I/O Worker #4] WARN [HttpQuery.logWarn] - [id: 0x7d1a7d8b, /192.168.2.6:35884 => /192.168.16.8:4242] Bad Request on /api/query?start=redacted&end=redacted&m=mimmax:10m-avg:redacted%7Bredacted%7D: No such name for 'tagv': 'redacted'