
failed: KafkaApiSpec #215

Closed
johanandren opened this issue Oct 20, 2016 · 3 comments · Fixed by #219

@johanandren (Contributor)

https://travis-ci.org/lagom/lagom/jobs/169197419 (and also in some PR validation runs)

com.lightbend.lagom.internal.broker.kafka.KafkaApiSpec

Not sure if any of these are related to the actual failure:

some ask timeouts

akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka://application/system/kafka-consumer-3#-352299927]] after [15000 ms]. Sender[null] sent message of type "akka.kafka.KafkaConsumerActor$Internal$Commit".
    at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:604)
    at akka.actor.Scheduler$$anon$4.run(Scheduler.scala:126)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
    at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:109)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
    at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:331)
    at akka.actor.LightArrayRevolverScheduler$$anon$4.executeBucket$1(LightArrayRevolverScheduler.scala:282)
    at akka.actor.LightArrayRevolverScheduler$$anon$4.nextTick(LightArrayRevolverScheduler.scala:286)
    at akka.actor.LightArrayRevolverScheduler$$anon$4.run(LightArrayRevolverScheduler.scala:238)
    at java.lang.Thread.run(Thread.java:745)
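If the commit ask timeout is just a slow Travis worker, one hedged mitigation (a sketch assuming akka-stream-kafka's `reference.conf` keys — verify against the version in use; the `15000 ms` in the stack trace matches that library's default commit timeout) would be to give commits more headroom:

```hocon
# Assumed settings path from akka-stream-kafka's reference.conf.
akka.kafka.consumer {
  # Default is 15s; raise it so slow CI workers don't trip AskTimeoutException
  # on the internal Commit message.
  commit-timeout = 30s
}
```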

MBean weirdness:

[warn] o.a.k.c.u.AppInfoParser - Error registering AppInfo mbean
javax.management.InstanceAlreadyExistsException: kafka.consumer:type=app-info,id=testservice-3
    at com.sun.jmx.mbeanserver.Repository.addMBean(Repository.java:437)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(DefaultMBeanServerInterceptor.java:1898)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(DefaultMBeanServerInterceptor.java:966)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(DefaultMBeanServerInterceptor.java:900)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:324)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at org.apache.kafka.common.utils.AppInfoParser.registerAppInfo(AppInfoParser.java:58)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:694)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:587)
    at akka.kafka.ConsumerSettings.createKafkaConsumer(ConsumerSettings.scala:308)
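The `InstanceAlreadyExistsException` suggests two consumers were constructed with the same client id (`testservice-3`), since the Kafka client registers an AppInfo MBean keyed by `client.id`. A self-contained sketch of the underlying JMX behaviour (stdlib only; the `ObjectName` mirrors the one in the log above):

```java
import java.lang.management.ManagementFactory;
import javax.management.InstanceAlreadyExistsException;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class MBeanClash {
    // Standard-MBean naming convention: class Demo exposes interface DemoMBean.
    public interface DemoMBean { String getId(); }
    public static class Demo implements DemoMBean {
        @Override public String getId() { return "testservice-3"; }
    }

    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        // Same ObjectName shape Kafka's AppInfoParser builds from the client.id:
        ObjectName name = new ObjectName("kafka.consumer:type=app-info,id=testservice-3");

        server.registerMBean(new Demo(), name); // first consumer: registers fine
        try {
            server.registerMBean(new Demo(), name); // second consumer, same client.id
        } catch (InstanceAlreadyExistsException e) {
            System.out.println("clash: " + e.getMessage());
        }
    }
}
```

If that is the cause here, giving each test consumer a distinct `client.id` should make the warning disappear.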

Kafka warnings about the leader:

[warn] o.a.k.c.NetworkClient - Error while fetching metadata with correlation id 0 : {test1=LEADER_NOT_AVAILABLE}
[warn] o.a.k.c.NetworkClient - Error while fetching metadata with correlation id 1 : {test1=LEADER_NOT_AVAILABLE}
[warn] o.a.k.c.NetworkClient - Error while fetching metadata with correlation id 1 : {test1=LEADER_NOT_AVAILABLE}
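These `LEADER_NOT_AVAILABLE` warnings are typically benign: with topic auto-creation, the first metadata fetches for a new topic can race leader election, and the client retries past them. The relevant broker settings (a sketch of stock `server.properties` keys, not something this build necessarily sets) would be:

```properties
# With auto-creation on, the first metadata requests for a fresh topic
# (here "test1") can return LEADER_NOT_AVAILABLE until a leader is elected.
auto.create.topics.enable=true
num.partitions=1
default.replication.factor=1
```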
@jroper (Member) commented Oct 21, 2016

Yeah, we've been trying to fix that for a while. I don't know what's going on.

I've never been able to reproduce a single one of these failures locally; they only happen on Travis.

@TimMoore (Contributor)

I think I'll try setting up a VirtualBox VM with specs similar to Travis to see if that helps to reproduce it.

@TimMoore (Contributor)

For reference, here are some examples of the failure:

(There are many more that were later retried, so we don't have the original logs anymore.)
