Brackets hangs when I do Find in Files (search argument "brackets") in the brackets-app folder (which has brackets embedded). I'm guessing that Brackets is choking on a very large and/or binary file.
We need to repro on Mac if possible to make sure this isn't Win only - assigning to Randy for now.
Does not seem to repro on Mac, so tagging Win only.
I narrowed the problem down to a console.log file in my brackets-app/bin/win folder. The file is textual (not binary), ~36M bytes in size, has ~720K lines of text, and I can open it in Brackets. I think the problem with Find in Files is that it has ~200k occurrences of the string "brackets". If I search on something I know appears in the file only 800 or so times (e.g. "uncaught"), it doesn't hang.
This points to another problem: we let our log files grow without bound.
I submitted a pull request to remove the console.log file in brackets-shell. Of course, this won't help with the file you have in brackets-app, but it should help prevent this in the future.
I think we can reduce this to a low-priority bug. We should be able to handle >200k occurrences, but that is an edge case.
There's a cutoff after 100 results, so in theory anything beyond that should make no difference. But looking at the code, it actually gathers and stores all 200k results first, applying the cutoff only when rendering the results table. So I bet we could fix this just by having the lower-level search code check FIND_IN_FILES_MAX itself (given Randy's finding that opening and searching the giant file is fast so long as there are no/few results in it).
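To make the idea concrete, here's a rough sketch of what pushing the cutoff into the lower-level search loop could look like. The function name, match shape, and the value 100 for FIND_IN_FILES_MAX are assumptions for illustration, not the actual Brackets code:

```javascript
// Hypothetical sketch: stop collecting matches in a file once the display
// cutoff is reached, instead of gathering all 200k hits and truncating
// only at render time.
const FIND_IN_FILES_MAX = 100; // assumed current cutoff value

function getSearchMatches(contents, queryRegExp, maxMatches) {
    // queryRegExp must have the /g flag so exec() advances through the line
    const matches = [];
    const lines = contents.split("\n");
    for (let i = 0; i < lines.length; i++) {
        let match;
        queryRegExp.lastIndex = 0;
        while ((match = queryRegExp.exec(lines[i])) !== null) {
            matches.push({ line: i, ch: match.index, text: lines[i] });
            if (matches.length >= maxMatches) {
                return matches; // early exit: skip the remaining hits entirely
            }
        }
    }
    return matches;
}
```

With this shape, a 36 MB file with 200k occurrences costs roughly the same as one with 100, since the scan bails out as soon as the cap is hit.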
p.s. Randy, should we take the 'Win only' tag back off? I assume if your Mac machine had a 35 MB file on it, things would also be very slow...?
We could stop searching after we get FIND_IN_FILES_MAX matches, but then the user wouldn't know the total number of matches. There is a big difference between 150 matches and 15,000 matches.
Another possibility is putting a separate cap on total matches. It seems reasonable to cap that at 10,000 or so.
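The two-level cap could be sketched roughly like this. All names here are illustrative (searchFn stands in for whatever per-file match routine exists), and both cap values are the ones proposed above, not settled constants:

```javascript
// Hypothetical sketch of a per-file cap plus a separate overall cap, so
// pathological folders can't accumulate hundreds of thousands of matches.
const FIND_IN_FILES_MAX = 100;   // per-file cap (assumed current value)
const TOTAL_MATCHES_MAX = 10000; // proposed overall cap

function searchFiles(files, searchFn) {
    // files: array of { path, contents }
    // searchFn(contents, maxMatches) -> array of match objects
    const resultsByFile = {};
    let total = 0;
    let hitMaxTotal = false;
    for (const file of files) {
        const remaining = Math.min(FIND_IN_FILES_MAX, TOTAL_MATCHES_MAX - total);
        const matches = searchFn(file.contents, remaining);
        if (matches.length) {
            resultsByFile[file.path] = matches;
            total += matches.length;
        }
        if (total >= TOTAL_MATCHES_MAX) {
            hitMaxTotal = true; // lets the UI say "more than 10,000 matches"
            break;
        }
    }
    return { resultsByFile, total, hitMaxTotal };
}
```

Returning a hitMaxTotal flag keeps the "150 vs. 15,000 matches" distinction visible to the user, at least up to the cap.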
Oh, and it seems like we may also be able to bump up the FIND_IN_FILES_MAX number. I set it way down at 100 because the original implementation was really slow. We could probably safely bump it up to 500 or so, although we should do some testing on slower machines to make sure it performs okay.
I put up a pull request for limiting each file to FIND_IN_FILES_MAX matches. Yes, it will limit the total hits, so I thought about stopping at 10 * FIND_IN_FILES_MAX instead.
I ran into another hang with a 145 MB .pack file in brackets-app/.git/objects/pack. Seems like we should not be searching .git directories, but maybe this is a unique case.
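A simple pre-filter on the candidate file list would cover both the .git case and oversized files like that .pack. The excluded directory names and the size threshold below are assumptions for the sketch, not an actual Brackets exclusion list:

```javascript
// Hypothetical filter: skip version-control metadata and very large files
// (like the 145 MB .pack file above) before running the search on them.
const EXCLUDED_DIRS = ["/.git/", "/.svn/", "/.hg/"];
const MAX_SEARCH_FILE_BYTES = 16 * 1024 * 1024; // e.g. skip files over 16 MB

function shouldSearchFile(path, sizeBytes) {
    if (EXCLUDED_DIRS.some(dir => path.indexOf(dir) !== -1)) {
        return false; // never search VCS internals
    }
    return sizeBytes <= MAX_SEARCH_FILE_BYTES;
}
```

A size cutoff also sidesteps the original console.log hang, though the right threshold (and whether it should be user-configurable) is open for discussion.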
I tried bumping up FIND_IN_FILES_MAX to 500, and resizing the panel height on my machine became very sluggish, so I guess we should hold off on changing this.