Error when logging slow process in aiokafka with python 3.9 #153

Closed
taybin opened this issue May 17, 2021 · 9 comments

taybin (Contributor) commented May 17, 2021

After upgrading to python 3.9, I started getting these crashes in the worker. They appear to happen when trying to log a warning about a slow commit to Kafka.

Checklist

  • I have included information about relevant versions
  • I have verified that the issue persists when using the master branch of Faust.

Full traceback

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/mode/services.py", line 802, in _execute_task
    await task
  File "/usr/local/lib/python3.9/site-packages/faust/transport/consumer.py", line 851, in _commit_livelock_detector
    await self.app.loop.run_in_executor(
  File "/usr/local/lib/python3.9/concurrent/futures/thread.py", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.9/site-packages/faust/transport/consumer.py", line 859, in verify_all_partitions_active
    self.verify_event_path(now, tp)
  File "/usr/local/lib/python3.9/site-packages/faust/transport/drivers/aiokafka.py", line 279, in verify_event_path
    return self._thread.verify_event_path(now, tp)
  File "/usr/local/lib/python3.9/site-packages/faust/transport/drivers/aiokafka.py", line 772, in verify_event_path
    self._log_slow_processing_commit(
  File "/usr/local/lib/python3.9/site-packages/faust/transport/drivers/aiokafka.py", line 847, in _log_slow_processing_commit
    self._log_slow_processing(
  File "/usr/local/lib/python3.9/site-packages/faust/transport/drivers/aiokafka.py", line 872, in _log_slow_processing
    return self.log.error(
  File "/usr/local/lib/python3.9/site-packages/mode/utils/logging.py", line 219, in error
    self.log(logging.ERROR, message, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/mode/utils/logging.py", line 287, in log
    self.logger.log(severity,
  File "/usr/local/lib/python3.9/logging/__init__.py", line 1512, in log
    self._log(level, msg, args, **kwargs)
TypeError: _log() got an unexpected keyword argument 'setting'

Versions

  • Python 3.9
  • Faust-Streaming 0.6.5
  • Docker image python:3.9
  • Kafka version
  • RocksDB version (if applicable)
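
For context on the traceback above: the standard library's logging.Logger.log() only understands the keyword arguments exc_info, stack_info, stacklevel and extra; any other keyword is passed straight into the internal _log() call and rejected. A minimal, self-contained sketch of the failure (the 'setting' value below is illustrative, not taken from Faust's source):

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("repro")

    # Accepted: custom fields belong in the `extra` mapping.
    logger.log(logging.ERROR, "Commit is slow", extra={"setting": "commit_timeout"})

    # Rejected: an arbitrary keyword argument reaches Logger._log() and raises
    # TypeError: _log() got an unexpected keyword argument 'setting'
    logger.log(logging.ERROR, "Commit is slow", setting="commit_timeout")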
taybin (Contributor, Author) commented May 17, 2021

Actually, it happens with Python 3.8 as well. I'm downgrading back to faust-streaming 0.6.4 to see if that makes it go away; it didn't show up until I switched to 0.6.5.

pwilczynskiclearcode (Contributor) commented:

https://docs.python.org/3/library/logging.html#logging.Logger.debug

There are four keyword arguments in kwargs which are inspected: exc_info, stack_info, stacklevel and extra.

The issue seems to have been there for a while:
https://github.com/faust-streaming/faust/blame/master/faust/transport/drivers/aiokafka.py#L872-L877
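
If the logging wrapper wants to keep accepting arbitrary keywords, one defensive pattern is to fold anything the standard library does not understand into the extra mapping before delegating. This is only a sketch of that idea, not the patch that was eventually merged (the 'setting' value is hypothetical):

    import logging

    # Keyword arguments that logging.Logger.log() actually inspects.
    _STDLIB_LOG_KWARGS = {"exc_info", "stack_info", "stacklevel", "extra"}

    def safe_log(logger, severity, message, *args, **kwargs):
        """Delegate to Logger.log(), moving unknown keywords into `extra`
        so they never reach the internal _log() call (sketch only)."""
        extra = dict(kwargs.pop("extra", None) or {})
        for key in list(kwargs):
            if key not in _STDLIB_LOG_KWARGS:
                extra[key] = kwargs.pop(key)
        if extra:
            kwargs["extra"] = extra
        logger.log(severity, message, *args, **kwargs)

    # The custom 'setting' keyword (hypothetical value) is carried via `extra`
    # instead of crashing the call.
    safe_log(logging.getLogger(__name__), logging.ERROR,
             "Commit is slow", setting="commit_timeout")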

jerynmathew commented, quoting taybin's earlier comment:

Actually it happens with python 3.8 as well. Downgrading back to faust-streaming 0.6.4 to see if that makes it go away. It didn't show up until I switched to 0.6.5.

Hey @taybin, did this solve your issue? I'm facing the same issue when running against Python 3.6 as well.

taybin (Contributor, Author) commented May 19, 2021

@jerynmathew It did, although I don't consider it a true "fix". At least, it hasn't reoccurred with 0.6.4.

forsberg (Contributor) commented May 20, 2021

Also hitting this on 0.6.5, and it seems reverting to 0.6.4 made it disappear.

It's quite ironic that a warning about slow processing makes the whole process die, making it even slower.

As noted above, the bug seems to have been there for quite a while, but it's very interesting that 0.6.5 seems to trigger it - some kind of performance regression?

taybin (Contributor, Author) commented May 20, 2021

@forsberg I see a couple of possibilities. 0.6.5 allows orjson>=3.0, so there could be a slowdown there. It also adds replication of the leader topic, but we don't write to that topic, so I'm not sure how that would cause an actual slowdown.

I'll run a test with 0.6.5 but with an older version of orjson and see whether the problem still reproduces.

forsberg (Contributor) commented:

@taybin As far as I know, I'm not doing any JSON serialization/deserialization in my code, so orjson seems an unlikely suspect in my case. I have a table, but I use Avro for serde of the table values.

The leader topic is disabled in my code, as I have no use for it.

forsberg added commits to forsberg/faust-streaming that referenced this issue May 21, 2021
rtrive commented May 27, 2021

I had the same problem with Faust 0.6.5 and Python 3.8.

patkivikram added a commit that referenced this issue May 28, 2021
Fixes crash in #153

Co-authored-by: Vikram Patki <54442035+patkivikram@users.noreply.github.com>
taybin (Contributor, Author) commented Jun 4, 2021

This issue has been fixed by #156, although there still seems to be a performance regression; or at least, detection of a performance issue was either switched back on or the threshold was lowered somehow.
