Filter watching clashing with youtube link parsing and shutting down all network activity #342
Of the updates, only Jsoup and OkHttp are suspect. OkHttp is especially suspect due to the major version change. I'll see what Jsoup does, and if it's not a big deal then I'll only revert the OkHttp update. Otherwise both of those will get rolled back. Nothing else stands out.
Jsoup looks fine, but I'm going to roll back OkHttp, as it has too many changes for me to manually look through.
It's not OkHttp as far as I can tell. I'm rolling back Jsoup and Gif-Drawable as well.
Yeah, it was one of Jsoup or Gif-Drawable. I'll refrain from updating those.
It's still occurring. The only other updated library that deals with loading is RxJava, which is the only one I haven't rolled back. I'll try that, and if it doesn't work, then everything is getting rolled back.
Okay, well, I've got good news and bad news. The good news is that none of the library updates actually did anything bad, so we're a-ok to upgrade.
Yeah, I have no idea. It's so strangely irreproducible that I can't figure out what's wrong. I want to blame 4chan or Cloudflare, but I need to see if it happens on another site to check that. And without a proper testing method for making it fail to load, I don't have any way of fixing it. Reading through all of the code from 4.5.4 until now shows nothing that would cause a loading issue this bad, because none of it was touching network request parsing at all.
It's possible that this issue goes back as far as 4.6.0; 4.5.4 doesn't seem to have any loading issues. There's a strange glitch where, if you rapidly tap the thread status cell, it momentarily displays "Loading in 9s" even though it should just say "Loading" without a time.
Okay, I have a method of reproduction, and this only occurs on 4.6.1+.
I'm going to go through every single commit after 4.6.1, add each one into a branch, and test it to see where the issue is, because reading through the code I clearly couldn't find any problems, so it's some strange interaction between disparate code blocks that I can't figure out.
Well, it looks like it's Youtube title parsing. Going to figure out WHY now.
I have no fucking clue why it breaks. All I know is that disabling Youtube titles prevents the issue from occurring. Going through ALL the logic it makes perfect sense, but then something is fucking up somewhere and I can't figure it out. There's some case where it breaks everything, but I don't know WHERE. And it only happens sometimes as well. Slapping that Heisenbug tag back onto this.
I've just tossed out all the Youtube title parsing code. While a nice touch, I'm not going to sacrifice application stability for something that small and so inconsistently buggy in normal use (it broke with no Youtube links at all! WHAT??), and with no good routes for debugging, it's not exactly fixable.
I have finally looked at the code and found something interesting. Look at this line: 1c158fe#diff-89ed1fbdeab5062aa485da49f5908519L103

Basically, it waits INFINITELY until the request completes (Future.get() without any parameters). But what if it never completes? Then the app will hang forever (exactly what is happening). So one possible fix is to set a timeout parameter (like 10-15 seconds per request).

My assumption is that the regexp sometimes fails to parse a youtube link and instead matches something similar (it may not be in the comment but in the page's HTML code itself); the request then gets executed, and since it's not a youtube link it just waits forever. Or maybe youtube sometimes just doesn't return a response (who knows?), or returns it with a huge delay (anti-spam?).

Also, you are parsing all of the links sequentially in a single thread. So what if there are 100 youtube links in a thread and each takes 5 seconds to get its title (bad connection)? The other possible fix is to use a thread pool (or just use RxJava; it's a perfect tool for tasks like this one) with, say, 10 threads to send requests and retrieve link titles.

I hope this helps, because this feature is really cool.
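The two fixes suggested above (a bounded `Future.get` and a small thread pool) could be sketched like this; `fetchTitle`, the pool size, and the 10-second limit are all illustrative, not the app's actual code:

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TitleFetchDemo {
    // Thread pool so many title lookups can run in parallel instead of sequentially.
    private static final ExecutorService POOL = Executors.newFixedThreadPool(10);

    // Hypothetical stand-in for the real network call to the title API.
    static String fetchTitle(String videoId) {
        return "Title for " + videoId;
    }

    static String getTitleWithTimeout(String videoId) {
        Future<String> future = POOL.submit(() -> fetchTitle(videoId));
        try {
            // Bounded wait: give up after 10 seconds instead of blocking forever.
            return future.get(10, TimeUnit.SECONDS);
        } catch (TimeoutException e) {
            future.cancel(true); // abandon the hung request
            return "Unable to get title";
        } catch (InterruptedException | ExecutionException e) {
            return "Unable to get title";
        }
    }

    public static void main(String[] args) {
        System.out.println(getTitleWithTimeout("dQw4w9WgXcQ"));
        POOL.shutdown();
    }
}
```

With a pool like this, 100 links at 5 seconds each finish in roughly 10 rounds rather than 500 seconds, and no single hung request can stall the rest.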
The regex is way too specific, and it's parsing node-by-node anyway, so it shouldn't need to match over a large amount of text, max 2000 characters or so. And as I noted above, even with threads that had no youtube links in them at all, it should skip over the entire input string and just return a new SpannableString instance of the input, which should be exactly the same behavior as before. But it would still hang and lock up all network activity even then.

It wasn't a timeout issue, because on an emulator connected to a very good and stable internet connection it would still lock up. Putting the API URL into a browser would get me the right result without any timeouts, so there were no connection issues that would have resulted in timeouts. If it just never returns a response, or returns it with a huge delay, then that's just a shitty API that I don't want to deal with. And I know that if you go over the quota you'll get a different response, but it won't time out; that was what the "Unable to get title" text was, for me and some people.

And I can't do it asynchronously; I did think about that. The actual text of the comment is modified in that method and is required to be fully set (i.e. no further text changes) for further parsing, which would screw everything up. The parsing of each post in a thread is already threaded, but I can't add threading on top of that, not that it would help much.

If you want to try and fix it, feel free, but basically everything I tried to debug it didn't work or made no sense at all. I even thought it was the ImageSpan on top of the PostLinkable at one point, but that's not it either. I have no goddamn clue what the hell was screwing up. It would work like 50% of the time, and the other 50% of the time it would just completely lock up the entire network queue. I would love to believe that it was because it wasn't timing out, but it was happening even after only like 10 thread reloads, very consistently.

Removing it solved it, very consistently. That's all the information I have. I would love for it to work, but I just have no clue. I need help from someone far more experienced than me here.
Well, it has actually happened for me, but everything still worked as usual; I had no infinite loading screens. I tried updating the thread a few times and it still worked normally. Here is the thread, btw: https://boards.4chan.org/g/thread/72675545#p72675545
It might be Android version specific; that's the only thing I didn't test. Android 9 is the only version I tested with, on both an emulator and a physical device. If that's the case, then that's bullshit.
And yeah, the regex isn't great. If there were a way to canonicalize the links or something, that would help.
Maybe we should just try using 4chanX's regex for youtube links? It works for them, and there are apparently no such issues.
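Canonicalizing as suggested above would mean extracting the 11-character video ID and rebuilding one standard URL from it. A minimal sketch; the pattern below is a generic `youtube.com`/`youtu.be` matcher written for illustration, not 4chanX's actual regex or the app's:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class YoutubeCanonicalize {
    // Illustrative pattern covering youtube.com/watch?v= and youtu.be/ forms.
    static final Pattern YT = Pattern.compile(
            "(?:https?://)?(?:www\\.)?(?:youtube\\.com/watch\\?(?:[^\\s]*&)?v=|youtu\\.be/)([\\w-]{11})");

    static String canonicalize(String link) {
        Matcher m = YT.matcher(link);
        if (m.find()) {
            // Rebuild a single canonical form from the captured video ID.
            return "https://www.youtube.com/watch?v=" + m.group(1);
        }
        return null; // not a recognizable YouTube link
    }

    public static void main(String[] args) {
        System.out.println(canonicalize("https://youtu.be/dQw4w9WgXcQ"));
        System.out.println(canonicalize("http://example.com/page"));
    }
}
```

Anchoring the match on a fixed-length ID capture keeps the regex from wandering over arbitrary text, which also limits how much a malformed link can confuse the parser.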
I'm reopening this, as I've determined this is an issue with filter watching clashing with the parsing somehow. It's coming back as an experimental feature for that reason; also, per the notes above, the regex is changing to 4chanX's, as it seems to work better in a number of cases, with one change for #335.
Update: I don't know how or why, but this change I made to better allow the FilterWatchManager instance to do its work in the background may have fixed this problem. I'm going to leave this as an experimental feature just to be sure, though.
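The shape of a change like that might be dispatching the filter-watch scan onto its own dedicated executor so it can never block the shared network queue. This is purely a hypothetical sketch; the class body and `scanCatalogs` are stand-ins, not the project's actual FilterWatchManager code:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FilterWatchDemo {
    // Dedicated single background thread so the filter-watch scan runs
    // off the shared network queue (the suspected source of the lockups).
    private final ExecutorService background = Executors.newSingleThreadExecutor();

    // Hypothetical stand-in for checking watched catalogs against filters.
    private String scanCatalogs() {
        return "scan complete";
    }

    public Future<String> requestScan() {
        return background.submit(this::scanCatalogs);
    }

    public void shutdown() {
        background.shutdown();
    }

    public static void main(String[] args) throws Exception {
        FilterWatchDemo manager = new FilterWatchDemo();
        System.out.println(manager.requestScan().get());
        manager.shutdown();
    }
}
```

Isolating the scan this way means that even if a scan hangs, only its own thread stalls; ordinary thread and image requests keep flowing.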
Please try to see if you can figure out why this is happening. I can confirm it's not my internet because I downgraded to 4.6.1 to check, and 4.6.1 loads much faster.
I have a feeling it's one of the library updates...