This repository has been archived by the owner on Jan 27, 2024. It is now read-only.

Pidgin crashes if there are too many messages to synchronize (only win32) #220

Open
TomTheDragon opened this issue Jan 7, 2016 · 19 comments

Comments

@TomTheDragon

It gives me a runtime error when there are too many messages to synchronize. This only happens in the Windows version of the plugin; the Linux one handles it without any problems.

@BenWiederhake
Collaborator

Can you post debug logs of the crash happening? I'm sure Eion would appreciate that.

You may want to send them directly to him, in case they contain "sensitive information".

Btw: We do test this case specifically, so this is news to us.

@TomTheDragon
Author

These are some of the last lines I see right before the crash when I start Pidgin in debug mode:


(21:37:17) prpl-telegram: sending all pending recipes
(21:37:17) prpl-telegram: tgl_do_mark_read (50101506)
(21:37:17) prpl-telegram: Sent query #6237518456976028800 of size 28 to DC 2
(21:37:17) prpl-telegram: sending all pending recipes
(21:37:17) prpl-telegram: tgl_do_mark_read (50101506)
(21:37:17) prpl-telegram: Sent query #6237518456976028804 of size 28 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028808 of size 8 to DC 2
(21:37:17) dnsquery: Performing DNS lookup for 149.154.167.91
(21:37:17) prpl-telegram: Sent query #6237518459986720128 of size 32 to DC 4
(21:37:17) prpl-telegram: Sent query #6237518459986720132 of size 32 to DC 4
(21:37:17) prpl-telegram: Sent query #6237518456976028812 of size 8 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028816 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028820 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028824 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028828 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028832 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028836 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028840 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028844 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028848 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028852 of size 36 to DC 2
(21:37:17) prpl-telegram: Sent query #6237518456976028856 of size 36 to DC 2
(21:37:17) dnsquery: Performing DNS lookup for 149.154.175.50
(21:37:17) prpl-telegram: Sent query #6237518459986720128 of size 32 to DC 1
(21:37:17) prpl-telegram: Sent query #6237518459986720136 of size 32 to DC 4
(21:37:17) prpl-telegram: Sent query #6237518459986720132 of size 32 to DC 1
Assertion failed!

Program: D:\Pidgin\pidgin.exe
File: queries.c, Line 119

Expression: c


It doesn't create any entries in the pidgin.RPT file.

@EionRobb
Contributor

EionRobb commented Jan 8, 2016

The pseudo-backtrace for this is

tree.h:74      tree_insert_query
queries.c:249  tglq_send_query_ex
...

and it looks like it's asserting because the query it's trying to put into TLS->queries_tree is identical to an existing query in the queries_tree.
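For illustration, a minimal sketch of why such a duplicate insert trips an assert like `Expression: c` (this is not tgl's actual tree.h template; the node and field names are made up):

```c
#include <assert.h>

struct node {
  long long key;              /* the query id used as the compare key */
  struct node *left, *right;
};

static struct node *tree_insert (struct node *root, struct node *n) {
  if (!root) { return n; }
  int c = (n->key > root->key) - (n->key < root->key);
  assert (c);                 /* c == 0: a query with this key is already in the tree */
  if (c < 0) {
    root->left = tree_insert (root->left, n);
  } else {
    root->right = tree_insert (root->right, n);
  }
  return root;
}
```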

@EionRobb
Contributor

EionRobb commented Jan 8, 2016

@TomTheDragon can you get a full debug log from connection start-up until the assert() happens? What we should see is something like prpl-telegram: Sent query #{ABC..} of size 32 to DC {X} appearing twice, with the same {ABC...} and the same {X}.

@EionRobb
Contributor

EionRobb commented Jan 9, 2016

Ok, looking through the code, this looks like a 32-bit vs 64-bit problem.

struct query is going to be arch-specific in size (https://github.com/majn/tgl/blob/ec0b19630f88ebe3b68e855f49ef134f0e98e943/queries.h#L40), but the tree comparison only looks at the first 8 bytes of the struct for uniqueness (https://github.com/majn/tgl/blob/ec0b19630f88ebe3b68e855f49ef134f0e98e943/queries.c#L118-L119). I'm not sure what the correct fix is here: whether struct query should be changed to use arch-independent sizes for data_len/flags/seq_no etc. (probably not possible because of the void *data), or whether the memcmp() should be changed to use sizeof(struct query).
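A hedged sketch of the two comparison options (the struct below only approximates struct query; the real layout is in tgl's queries.h):

```c
#include <string.h>

/* Rough, illustrative stand-in for struct query; its real size differs
 * between 32-bit (win32) and 64-bit builds. */
struct query_sketch {
  long long msg_id;   /* the first 8 bytes: the query id */
  int data_len;
  int flags;
  int seq_no;
  void *data;         /* 4 bytes on win32, 8 bytes on 64-bit Linux */
};

/* What effectively happens today: only the first 8 bytes decide uniqueness,
 * so two different queries that happen to share an id compare as equal and
 * the tree insert asserts. */
static int cmp_first_8_bytes (const struct query_sketch *a, const struct query_sketch *b) {
  return memcmp (a, b, 8);
}

/* The other option mentioned above: compare the whole struct. This is only
 * safe if every byte (including padding) is initialized the same way for
 * queries that should compare equal. */
static int cmp_whole_struct (const struct query_sketch *a, const struct query_sketch *b) {
  return memcmp (a, b, sizeof (struct query_sketch));
}
```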

@EionRobb
Contributor

EionRobb commented Jan 9, 2016

@TomTheDragon can you try with this dll, which has BenWiederhake/tgl@87abdf0 added to try and fix your crash?

@TomTheDragon
Author

(repost from the telegram chat)
I've been seeing something like this for minutes now:


(04:57:24) prpl-telegram: Resent query #6238002925376460184 as #6238002951143444744 of size 68 to DC 2
(04:57:24) prpl-telegram: Alarm query 6238000298997585376 (type 'get config')
(04:57:24) prpl-telegram: No such query
(04:57:25) prpl-telegram: Alarm query 6238000298997585376 (type 'get config')
(04:57:25) prpl-telegram: No such query
(04:57:26) prpl-telegram: Alarm query 6238000298997585376 (type 'get config')
(04:57:26) prpl-telegram: No such query
(04:57:27) prpl-telegram: Alarm query 6238000298997585376 (type 'get config')
(04:57:27) prpl-telegram: No such query
(04:57:28) prpl-telegram: Alarm query 6238000298997585376 (type 'get config')
(04:57:28) prpl-telegram: No such query
(04:57:29) prpl-telegram: Alarm query 6238000298997585376 (type 'get config')
(04:57:29) prpl-telegram: No such query
(04:57:30) prpl-telegram: Alarm query 6238002951143444744 (type 'get difference')
(04:57:30) prpl-telegram: Resent query #6238002951143444744 as #6238002976914693776 of size 68 to DC 2


It also doesn't write any changes to the "state" file.

@BenWiederhake
Collaborator

@EionRobb: AFAIR I sent you a patch that uses only a single, global counter in tgl_state instead of separate counters in each tgl_session. I'm currently waiting on you to build a DLL and send it to Tom. Is this coming along? Is it working?
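A rough sketch of the idea behind that patch (the names below are illustrative, not the actual tgl_state/tgl_session members): with one counter per session, two sessions started at the same instant can hand out the same query id; drawing the id from a single counter in tgl_state keeps ids unique across all DC connections.

```c
#include <stdint.h>

struct tgl_state_sketch {
  uint64_t id_seq_no;                /* single counter shared by every session */
};

struct tgl_session_sketch {
  struct tgl_state_sketch *TLS;      /* no per-session counter anymore */
};

/* MTProto-style id: roughly the unix time in the high 32 bits plus an
 * increasing low part (the ids in the logs above step by 4). Taking the
 * low part from TLS instead of the session means two sessions created in
 * the same second no longer generate identical ids. */
static uint64_t next_msg_id (struct tgl_session_sketch *S, uint64_t unix_time) {
  S->TLS->id_seq_no += 4;
  return (unix_time << 32) | (S->TLS->id_seq_no & 0xffffffffu);
}
```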

@EionRobb
Contributor

libtelegram.zip

Apologies for the hold-up. This build has the patches from https://github.com/BenWiederhake/telegram-purple/tree/tmp/for-eion

@TomTheDragon
Author

OK, I did two tests with the new test build, using a 10-day-old state file.

Test 1:
It ran much longer than before, but it crashed after synchronizing around 1-2 days' worth of messages.
I just started it again, hoping it would continue from the same spot where it had crashed during synchronization.
It looks like it skipped ahead to the next day and then synchronized much more (around 5 days), but the newest logs were missing.
I tried waiting and restarting multiple times, but it wouldn't fetch the newest logs.

Test 2:
I restored the old 10-day-old state file and removed the logs (without removing the downloaded files), and it crashed at the same point as before.
It also skipped ahead to the next day when I restarted Pidgin, but this time it did get all the logs up to the newest ones.

Here is the error message from the two crashes (only some of the last lines):


(20:09:41) proxy: Connected to 149.154.167.91:443.
(20:09:41) prpl-telegram: outbound rpc connection from dc #4 becomed ready
(20:09:41) prpl-telegram: Sent query #6241577821256865772 of size 56 to DC 4
(20:09:41) prpl-telegram: update_user_handler() flags: CREATED PHOTO NAME ACCESS_HASH USERNAME
(20:09:41) prpl-telegram: update_user_handler() flags: CREATED PHOTO NAME ACCESS_HASH USERNAME
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x4e90bfd6 (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x9961fd5c (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x2f2f21bf (check=-1)
(20:09:41) prpl-telegram: update 0x68c13933 (check=-1)
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Already downloaded
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
(20:09:41) prpl-telegram: Sent query #6241577821256865776 of size 68 to DC 2
(20:09:41) prpl-telegram: restarting query 6241577821256865772
(20:09:41) prpl-telegram: Alarm query 6241577821256865772 (type 'get config')
(20:09:41) prpl-telegram: wrote state file: wpts=26873 wqts=0 wseq=2557 wdate=1453230577
(20:09:41) prpl-telegram: wrote secret chat file: 0 chats written.
(20:09:41) prpl-telegram: work_new_session_created: msg_id = 6241577814718183425, dc = 4
(20:09:41) prpl-telegram: regen query from old session 6241577808057671064
(20:09:41) prpl-telegram: Sent query #6241577821256865784 of size 36 to DC 2
(20:09:41) prpl-telegram: Alarm query 6241577808057671064 (type 'download part')

(20:09:41) prpl-telegram: Resent query #6241577808057671064 as #6241577821256865788 of size 32 to DC 4
(20:09:41) prpl-telegram: error for query 'download part' #6241577821256865788:#400 OFFSET_INVALID
(20:09:41) prpl-telegram: tgp_msg_on_loaded_document()
Assertion failed!

Program: D:\Pidgin\pidgin.exe
File: tgp-msg.c, Line 598

Expression: success


@BenWiederhake
Collaborator

Sorry, it seems this nearly petered out.

It looks like it now fails to download a document, which seems to be another, unrelated bug.

I fear that the bug is another incarnation of EionRobb/tgl@958aeec .
Can you somehow deduce which document that is? It could be any file: a big image, GIF, audio, or video. It would greatly help if you could find out whether this file is < 4 KiB, < 16 KiB (the default buffer size of many things in tgl), or even bigger.

@BenWiederhake
Collaborator

New report from @radasbona / Markus M. via the dev chat on the same "Expression: c" thingy we originally had. So this seems to affect more than one person.

@TomTheDragon -- poke! I'd really like to hear what that second issue is.

@TomTheDragon
Author

Give me some time; I don't have an old state file atm, so I just made one today. I will test again in a few days, once I have enough new messages.

@BenWiederhake
Collaborator

Take your time :)

@TomTheDragon
Author

I did a test again today, but I was not able to reproduce the crash; it synced up the logs for the whole 2.5 weeks and downloaded 97 files without any problems.
It did lose the connection once, but it kept going after reconnecting; I'm guessing this can happen when there is just too much going on.

...
(18:24:00) prpl-telegram: Sent query #6261960271417286172 of size 20 to DC 2
(18:24:00) prpl-telegram: sending all pending recipes
(18:24:00) prpl-telegram: tgl_do_mark_read (50101506)
(18:24:00) prpl-telegram: Sent query #6261960271417286176 of size 28 to DC 2
(18:24:00) prpl-telegram: ping alarm
(18:24:00) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:02) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:02) prpl-telegram: 8835403: when=1457976243
(18:24:02) prpl-telegram: mobile
(18:24:09) prpl-telegram: ping alarm
(18:24:09) prpl-telegram: ping alarm
(18:24:14) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:14) prpl-telegram: 88175141: when=1457976255
(18:24:14) prpl-telegram: mobile
(18:24:15) prpl-telegram: ping alarm
(18:24:21) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:21) prpl-telegram: tgprpl_blist_node_menu()
(18:24:23) prpl-telegram: tgprpl_blist_node_menu()
(18:24:24) prpl-telegram: ping alarm
(18:24:24) prpl-telegram: ping alarm
(18:24:24) prpl-telegram: tgprpl_blist_node_menu()
(18:24:25) prpl-telegram: tgprpl_blist_node_menu()
(18:24:26) prpl-telegram: tgprpl_blist_node_menu()
(18:24:27) prpl-telegram: tgprpl_blist_node_menu()
(18:24:28) prpl-telegram: tgprpl_blist_node_menu()
(18:24:28) prpl-telegram: tgprpl_blist_node_menu()
(18:24:29) prpl-telegram: tgprpl_blist_node_menu()
(18:24:30) prpl-telegram: tgprpl_blist_node_menu()
(18:24:30) prpl-telegram: ping alarm
(18:24:30) prpl-telegram: tgprpl_blist_node_menu()
(18:24:31) prpl-telegram: tgprpl_blist_node_menu()
(18:24:32) prpl-telegram: tgprpl_blist_node_menu()
(18:24:32) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:39) prpl-telegram: ping alarm
(18:24:39) prpl-telegram: ping alarm
(18:24:39) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:45) prpl-telegram: ping alarm
(18:24:49) prpl-telegram: update 0x1bfbd823 (check=0)
(18:24:49) prpl-telegram: 167801027: when=1457976290
(18:24:49) prpl-telegram: mobile
(18:24:54) prpl-telegram: ping alarm
(18:24:54) prpl-telegram: ping alarm
(18:25:00) prpl-telegram: ping alarm
(18:25:03) account: Disconnecting account +4912345678901 (00509FA0)
(18:25:03) connection: Disconnecting connection 0E720A50
(18:25:03) prpl-telegram: tgprpl_close()
...

But I will make a backup of the newest state file so that I can reproduce any crash if one happens again.

@TomTheDragon
Author

The bug came back after I updated from v1.2.4 to v1.2.6. It worked yesterday, but it always crashes now that there are many more messages to synchronize.

Here is the log:


(14:46:25) prefs: /pidgin/conversations/toolbar/wide changed, scheduling save.
(14:46:25) log: Failed to open log file "D:\Pidgin\data.purple\logs\XXXXX.log"for reading: No such file or directory
(14:46:25) log: Failed to open log file "D:\Pidgin\data.purple\logs\XXXXX.log"for reading: No such file or directory
(14:46:25) prpl-telegram: sending all pending recipes
(14:46:25) prpl-telegram: tgl_do_mark_read (50101506)
(14:46:25) prpl-telegram: Sent query #6271908035904412804 of size 28 to DC 2
(14:46:25) prpl-telegram: Sent query #6271908035904412808 of size 8 to DC 2
(14:46:25) prpl-telegram: Sent query #6271908035904412812 of size 8 to DC 2
(14:46:25) prpl-telegram: Sent query #6271908035904412816 of size 8 to DC 2
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Sent query #6271908035904412820 of size 32 to DC 2
(14:46:25) prpl-telegram: Sent query #6271908035904412824 of size 32 to DC 2
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Sent query #6271908035904412828 of size 36 to DC 2
(14:46:25) dnsquery: Performing DNS lookup for 149.154.175.100
(14:46:25) prpl-telegram: Sent query #6271908039575984064 of size 32 to DC 3
(14:46:25) prpl-telegram: Sent query #6271908039575984068 of size 32 to DC 3
(14:46:25) dnsquery: Performing DNS lookup for 91.108.56.127
(14:46:25) prpl-telegram: Sent query #6271908039575984064 of size 32 to DC 5
(14:46:25) prpl-telegram: Already downloaded
(14:46:25) prpl-telegram: tgp_msg_on_loaded_document()
(14:46:25) prpl-telegram: Sent query #6271908039575984068 of size 32 to DC 5
Assertion failed!

Program: D:\Pidgin\pidgin.exe
File: queries.c, Line 119

Expression: c


No entry in the pidgin.RPT file was created.

@TomTheDragon
Author

This problem is getting more and more ridiculous. It now crashes at least once every day when I try to launch Pidgin, and when I restart (sometimes multiple times in a row), all the latest messages are gone. So I always have to make a backup of the state file before starting Pidgin, so that I can copy it back to get the old messages again.
It has gone so far that I have stopped using it, because it's simply not usable at this point.

@TomTheDragon
Author

A little report after some time:
The problem may have gotten "a little bit" better with v1.3.0r2, but it's still very much present.
It's really rare that Pidgin does not crash on the first start, and I normally have to restart it multiple times to get it running.
I'm not sure, but it may have to do with how the plugin handles avatar downloading, because it crashes instantly when I turn on the detail view in the buddy list, and it keeps crashing many times after that.

@dequis
Contributor

dequis commented Nov 12, 2016

I got this in bitlbee/linux, and the error matches the original bug. However, I'm using a very old git version (65105dd, 10 months old), and the assert that was failing here is gone from current versions.

DEBUG prpl-telegram: Already downloaded
DEBUG prpl-telegram: tgp_msg_on_loaded_document()
DEBUG prpl-telegram: Sent query #6351867698168385536 of size 32 to DC 4
DEBUG prpl-telegram: Sent query #6351867698171148288 of size 32 to DC 4
DEBUG prpl-telegram: Sent query #6351867698173499392 of size 32 to DC 5
DEBUG prpl-telegram: Sent query #6351867698174756864 of size 32 to DC 5
DEBUG prpl-telegram: Sent query #6351867698175475712 of size 32 to DC 4
DEBUG prpl-telegram: Sent query #6351867698177325056 of size 32 to DC 4
DEBUG prpl-telegram: Sent query #6351867698179852288 of size 32 to DC 4
DEBUG prpl-telegram: Already downloaded
DEBUG prpl-telegram: tgp_msg_on_loaded_document()
DEBUG prpl-telegram: Sent query #6351867697164258304 of size 36 to DC 1
DEBUG prpl-telegram: Sent query #6351867698187645952 of size 32 to DC 5
DEBUG prpl-telegram: Sent query #6351867698189336576 of size 36 to DC 4
DEBUG prpl-telegram: Already downloaded
DEBUG prpl-telegram: tgp_msg_on_loaded_document()
DEBUG prpl-telegram: Sent query #6351867698195780608 of size 36 to DC 4
DEBUG prpl-telegram: Sent query #6351867697178409984 of size 32 to DC 1
DEBUG prpl-telegram: Sent query #6351867698205806592 of size 36 to DC 4
DEBUG prpl-telegram: Sent query #6351867697195151360 of size 128 to DC 1
DEBUG prpl-telegram: wrote state file: wpts=61815 wqts=0 wseq=527 wdate=1478909440
DEBUG prpl-telegram: wrote secret chat file: 0 chats written.
DEBUG prpl-telegram: Sent query #6351867699724338176 of size 36 to DC 4
DEBUG prpl-telegram: Alarm query 6351867614130753536 (type 'get config')
DEBUG prpl-telegram: Sent query #6351867699262371840 of size 36 to DC 1
DEBUG prpl-telegram: Sent query #6351867701209387008 of size 36 to DC 4
DEBUG prpl-telegram: Sent query #6351867700954826752 of size 36 to DC 1
DEBUG prpl-telegram: tgp_msg_on_loaded_document()
DEBUG prpl-telegram: Sent query #6351867702724263936 of size 36 to DC 1
DEBUG prpl-telegram: Alarm query 6351867614130753536 (type 'get config')
DEBUG prpl-telegram: Sent query #6351867703807982592 of size 36 to DC 1
DEBUG prpl-telegram: Sent query #6351867705204518912 of size 36 to DC 4
DEBUG prpl-telegram: error for query 'download part' #6351867698168385536: #400 OFFSET_INVALID
DEBUG prpl-telegram: tgp_msg_on_loaded_document()
bitlbee: tgp-msg.c:598: tgp_msg_on_loaded_document: Assertion `success' failed.
==4397==
==4397== Process terminating with default action of signal 6 (SIGABRT): dumping core
==4397==    at 0x739C04F: raise (in /usr/lib/libc-2.24.so)
==4397==    by 0x739D479: abort (in /usr/lib/libc-2.24.so)
==4397==    by 0x7394EA6: __assert_fail_base (in /usr/lib/libc-2.24.so)
==4397==    by 0x7394F51: __assert_fail (in /usr/lib/libc-2.24.so)
==4397==    by 0xD014AC7: tgp_msg_on_loaded_document (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgp-msg.c:598)
==4397==    by 0xD02D8AB: download_on_error (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/queries.c:3074)
==4397==    by 0xD01FBBE: tglq_query_error (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/queries.c:416)
==4397==    by 0xD01A954: work_rpc_result (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/mtproto-client.c:865)
==4397==    by 0xD01AF5B: rpc_execute_answer (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/mtproto-client.c:968)
==4397==    by 0xD01A5D1: work_container (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/mtproto-client.c:819)
==4397==    by 0xD01AF07: rpc_execute_answer (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/mtproto-client.c:962)
==4397==    by 0xD01BAEE: process_rpc_message (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/mtproto-client.c:1154)
==4397==    by 0xD01BE18: rpc_execute (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgl/mtproto-client.c:1208)
==4397==    by 0xD00872A: try_rpc_read (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgp-net.c:431)
==4397==    by 0xD008908: try_read (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgp-net.c:476)
==4397==    by 0xD007B43: conn_try_read (/home/dx/test/named-asd/telegram/carapace/telegram-purple/src/telegram-purple/tgp-net.c:227)
==4397==    by 0x1C2B3B: gaim_io_invoke (/lib/events_glib.c:86)
==4397==    by 0x53F8474: g_io_unix_dispatch (/home/dx/test/build/glib2/glib2/src/glib/glib/giounix.c:165)
==4397==    by 0x53996CA: g_main_dispatch (/home/dx/test/build/glib2/glib2/src/glib/glib/gmain.c:3203)
==4397==    by 0x539A652: g_main_context_dispatch (/home/dx/test/build/glib2/glib2/src/glib/glib/gmain.c:3856)
==4397==    by 0x539A856: g_main_context_iterate (/home/dx/test/build/glib2/glib2/src/glib/glib/gmain.c:3929)
==4397==    by 0x539ACF6: g_main_loop_run (/home/dx/test/build/glib2/glib2/src/glib/glib/gmain.c:4125)
==4397==    by 0x1C2A34: b_main_run (/lib/events_glib.c:59)
==4397==    by 0x1BD1DE: main (unix.c:177)

@TomTheDragon You should get fresh debug logs, or maybe just consider this bug fixed and open a different one.
