Bug report for OKC-PAT #3

Closed
safadancer opened this Issue Apr 11, 2013 · 6 comments

Comments


safadancer commented Apr 11, 2013

I installed the PAT for OKC on Firefox last night and noticed that my OKC immediately became incredibly slow and laggy. I slowly slogged through the 15-question test to activate the PAT, and every time I clicked a radio button, it took several seconds to even register. At about question 3, Firefox froze completely and I had to Force Quit it.

When I re-opened, it brought up my previous tabs, including OKC, which remained incredibly slow and laggy as I worked through the rest of the questions. Firefox would not allow me to click through to other tabs, either, as the whole browser was just very confused. When I finished the questions and attempted to browse through OKC suggestions to see if PAT was working, OKC was so slow that I thought Firefox had crashed again and was about to Force Quit when it sloooooowly started to load a photo for a user.

I quit Firefox and switched to Chrome to test OKC, and it works fine there. This morning I went back to OKC on Firefox to test if it was still laggy and it definitely is; it took about 5 seconds for each click to register, and I closed the OKC tab before it could crash again.

focalintent commented May 8, 2013

Do you see this problem when you run it in Firefox - or do other users, for that matter? (I don't currently have Firefox handy - I got sick and tired of its CPU and memory usage issues and now flip between Safari and Chrome.)

Also - do you know offhand what version of Firefox they were using?

Owner

meitar commented May 8, 2013

Yes, @focalintent. I can verify this is a problem (at least for me), and I've gotten additional reports in other channels from others that confirm it is for them, too. Sadly, I don't know what version of Firefox @safadancer was using, and I don't have any additional useful data from the other reporters, but I've verified this issue on the latest stable Firefox (release channel, version 20.0) with the latest stable Greasemonkey (version 1.8).

focalintent commented May 8, 2013

So I did a quick run-through with Firebug, and it looks like, if its profiler wasn't being a complete liar to me, this might be one possible culprit: ldata.answers.length can get really high, really quickly - just from a couple of minutes of browsing around, it had close to 200 items in it. When data.answers.length was 5, that meant 1,000 times through that inner loop. The higher those two lengths get, the greater the loop count.

                    for (var i = 0; i < ldata.answers.length; i++) {
                        for (var x = 0; x < data.answers.length; x++) {
                            // If we have already saved this data locally,
                            if (data.answers[x].qid === ldata.answers[i].qid) {
                                OKCPAT.log(screenname + '\'s scraped QID ' + data.answers[x].qid + ' matches their locally saved QID ' + ldata.answers[i].qid);
                                // remove that QID from the list of QIDs to save.
                                for (var y = 0; y < answers_to_add.length; y++) {
                                    OKCPAT.log('Removing QID ' + answers_to_add[y].qid + ' from ' + screenname + '\'s data-to-save.');
                                    answers_to_add.splice(answers_to_add.indexOf(answers_to_add[y]), 1);
                                }
                            }
                        }
                    }

If you can index ldata.answers by qid instead of by array index, then you can get rid of the outer loop and instead do if (ldata.answers[data.answers[x].qid]) ... to see whether a qid is already in the local table. If that doesn't do it, I have a couple of other possible ideas you can look at - I have to run out and I'm not sure when I can get back to this, possibly not until tomorrow, but I figured I'd pass along a start.
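
For anyone reading along, here is a minimal sketch of the qid-indexed lookup focalintent describes. It reuses the ldata, data, answers_to_add, screenname, and OKCPAT.log names from the snippet above; the fix actually committed for 0.2.4 may structure this differently.

    // Build a lookup table keyed by qid from the locally saved answers (one pass).
    var saved_by_qid = {};
    for (var i = 0; i < ldata.answers.length; i++) {
        saved_by_qid[ldata.answers[i].qid] = ldata.answers[i];
    }
    // Keep only the scraped answers whose qid is not already saved locally.
    // This replaces the nested loops above with one constant-time lookup per answer.
    var answers_to_add = data.answers.filter(function (answer) {
        if (saved_by_qid[answer.qid]) {
            OKCPAT.log(screenname + '\'s scraped QID ' + answer.qid +
                ' is already saved locally; skipping it.');
            return false; // already saved, so don't add it again
        }
        return true; // not yet saved, so keep it in the data-to-save
    });

With this shape, the work grows with the number of scraped answers rather than with the product of the two list lengths.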

meitar added a commit that referenced this issue May 8, 2013

Issue #3: Remove nested loop by indexing data using `qid` in memory.
Thanks to focalintent who identified a performance bottleneck. More information is available at the GitHub.com issue tracker at #3
Owner

meitar commented May 8, 2013

Thanks so much for looking into this, @focalintent. I've taken your suggestion and committed 9e5b25c in a new 0.2.4 branch. I think this is better. What do you think?

focalintent commented May 8, 2013

The debug console is reporting an error - an unexpected JSON character around line 362/363 - but unfortunately the JSON parser isn't giving much more in the way of useful information. It also doesn't seem to be affecting anything that works.

Other than that, though, it definitely seems to be performing better. I wonder if that JSON parsing error is contributing to the still-slower-than-Chrome behavior you're seeing?

Owner

meitar commented May 8, 2013

Yeah, that error is because GAE currently responds with an error instead of with JSON, so I'm inclined to believe it's unrelated to this issue.

Thanks for taking a look at this! I'll be releasing version 0.2.4 with this patch shortly.

meitar closed this May 8, 2013
