
Data received gets less and less. #45

Closed
@fsteckel

Description


Hello again,

I'd like to report a bug. Environment: Python 3.7.5, pip 18.1, Ubuntu 19.10 in a local Docker container, with unicorn binance 1.11.0.

I'm sorry, this is rather unspecific: I posted another issue before and attached a log file. There you suggested putting all the symbols' streams into one single stream, as opposed to my separating each and every stream. At the time I did not remember why I had separated them in the first place, so I merged them and now have two streams, one for depth and one for trade, each containing all the symbols.

Something strange happens:
No exceptions are raised and everything seems OK. I collect all the message frames and pack them into SQL files (one per day). On the first day I got a file of about 6-8 GB; on the second day 400 MB, the third 300 MB, the fourth 250 MB. Less and less data is being packed into the SQL files.
This (SQL) system has worked before without this error, which is why I suspect the unicorn package (which is otherwise great).
Encountering this behavior in the past is what drove me to separate the streams, so that for each stream I could check
`stream_info['status'] != 'running'`
and also
`time.time() - stream_info['last_heartbeat'] > self.stream_heartbeat_tolerance`.
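For reference, the per-stream watchdog described above can be sketched roughly like this (a minimal standalone sketch: the `stream_info` dict keys and the two conditions mirror the checks quoted above, but the helper function and the default tolerance are hypothetical, not part of the unicorn package):

```python
import time

def stream_looks_dead(stream_info, heartbeat_tolerance=60):
    """Return True if a stream should be considered dead, based on the two
    checks above: a non-'running' status, or a last heartbeat that is older
    than the allowed tolerance (in seconds)."""
    if stream_info.get('status') != 'running':
        return True
    if time.time() - stream_info.get('last_heartbeat', 0) > heartbeat_tolerance:
        return True
    return False
```

With one stream per symbol, a stale symbol shows up immediately as a dead stream; with all symbols combined into one stream, these checks keep passing as long as any single symbol still keeps the stream alive.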

One (maybe) reason:
Is the 'running' flag set as long as at least one of the symbols in the stream is still being updated? If so, it would give a false sense of the stream being healthy. The same goes for the last heartbeat: it would better reflect the last heartbeat of the least recently updated symbol.
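The per-symbol heartbeat idea could be sketched like this (purely illustrative; the class and all its names are hypothetical and not part of the unicorn package):

```python
import time

class SymbolHeartbeatTracker:
    """Track the last update time per symbol within a combined stream, so that
    the least recently updated symbol can serve as the effective heartbeat."""

    def __init__(self, symbols, now=time.time):
        # `now` is injectable so the tracker can be tested with a fake clock.
        self._now = now
        self.last_update = {symbol: now() for symbol in symbols}

    def on_message(self, symbol):
        # Record an update for one symbol in the combined stream.
        self.last_update[symbol] = self._now()

    def stale_symbols(self, tolerance):
        # Symbols whose last update is older than `tolerance` seconds.
        cutoff = self._now() - tolerance
        return [s for s, t in self.last_update.items() if t < cutoff]

    def effective_heartbeat(self):
        # The heartbeat of the least recently updated symbol.
        return min(self.last_update.values())
```

A combined stream would then be flagged as unhealthy as soon as `stale_symbols()` is non-empty, even while other symbols keep the stream's own heartbeat fresh.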

My suspicions may be totally off and I apologize for this rather unspecific 'bug'.
Thanks for any help you can provide, Oliver!

Labels

bug (Something isn't working)
