Nonterminating processing #1546
135k lines! 😱 I assume this is happening during […]. There isn't a more verbose debug mode, unfortunately. Usually what I would do is start bisecting the file to see if I can narrow it down to a particular part of the code. But with that many lines, it may just be the size of the file! You might try […]. Otherwise, you may have to skip the file with […].
Yeah, I know :) Actually, interrupting always seems to point to the same place: a deep clone that appears to be non-terminating. Does that ring a bell? It always seems to be about […]. When I add logs in […].
Yes, that's right. That's the heart of Brakeman's data flow analysis, and it's generally the cause of most performance issues. It may be building a string, or an array, or a set of branches (e.g. lots of assignments in nested […]). Did […]?
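The branch blow-up described here can be illustrated with a toy model. This is not Brakeman's actual code, just a sketch of the principle: if a tracker records every value a variable may hold, each conditional assignment doubles the set, so a long run of branches grows it exponentially.

```ruby
# Toy model of naive value tracking (not Brakeman internals): each
# conditional assignment can leave the variable with either branch's
# value, so the set of tracked values doubles at every branch.
possible_values = [""]

branches = [%w[a b], %w[c d], %w[e f], %w[g h]]
branches.each do |left, right|
  possible_values = possible_values.flat_map { |v| [v + left, v + right] }
end

puts possible_values.size  # 16 tracked values after only 4 branches (2**4)
```

In a 135k-line generated file full of assignments, this kind of growth (and the repeated deep clones it forces) is a plausible way for a scan to effectively never finish.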
No; I also usually bisect the file until I can pinpoint the issue, but this file is really massive :-( What I'd like to understand is "where" in the processed file Brakeman gets into this tarpit. On a related note, it might be useful to support a timeout mechanism for analyses.
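The timeout mechanism suggested above could look roughly like this, using Ruby's stdlib Timeout module. This is a sketch of a hypothetical feature, not an existing Brakeman option, and `process_with_budget` is an invented name:

```ruby
require 'timeout'

# Hypothetical per-file time budget (not an existing Brakeman feature):
# abort the analysis of a single file once it exceeds its budget and
# report it as skipped instead of hanging the whole scan.
def process_with_budget(path, seconds: 300)
  Timeout.timeout(seconds) { yield path }
rescue Timeout::Error
  warn "Skipped #{path}: exceeded #{seconds}s budget"
  :skipped
end

process_with_budget("huge_model.rb", seconds: 0.2) { sleep 0.5 }  # => :skipped
process_with_budget("small_model.rb") { :done }                   # => :done
```

A real implementation would also have to leave the analyzer's state consistent after the interrupt, which is the hard part of adding such a feature.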
Based on the stack trace, it looks like it's processing the value (right-hand side) of a local variable assignment inside a method. You could add logging to these methods:

def process_defn exp
  Brakeman.debug "Processing method `#{exp.method_name}` line #{exp.line}"
  # ...
end

def process_lasgn exp
  Brakeman.debug "Processing assignment to `#{exp.lhs}` line #{exp.line}"
  # ...
end
Justin, […]
Justin, the file below suffices to reproduce my problem. Disclaimer: this code was not written by us; in fact, it was not "written" at all: it's generated. Also, there are several things in it that don't make sense to me. And I had to mask it a bit. Cheers!

class Foo < ApplicationRecord
def pl_grids_to_xlsx(year,version,opts={})
year ||= Date.today.year
package = Axlsx::Package.new
opts[:xlsx_export] = true
fixed_lines = nil
res_hash = self.foo_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear") }
line[:data] = line[:data].reject{ |k,v| [ "hide_property", ].include?(k.to_s)}
new_line = {}
tmp = TypeCode.find(line[:data]["type_code"].to_i)
new_line["type_code"] = tmp.name.to_s.strip
line[:data] = line[:data].reject{|k,v| k.to_s == "type_code" }
new_line.merge(line[:data])
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = Foo.to_xlsx(opts)
fixed_lines = nil
res_hash = self.bars_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear") }
line[:data] = line[:data].reject{ |k,v| [].include?(k.to_s)}
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = BarLine.to_xlsx(opts)
fixed_lines = nil
res_hash = self.bazs_lines_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear") }
line[:data] = line[:data].reject{ |k,v| [].include?(k.to_s)}
new_line = {}
tmp = Foo.find(line[:data]["foo"].to_i)
new_line["name"] = tmp.name.to_s.strip
new_line["matricule"] = tmp.matricule.to_s.strip
line[:data] = line[:data].reject{|k,v| k.to_s == "foo" }
new_line.merge(line[:data])
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = BazsLine.to_xlsx(opts)
fixed_lines = nil
res_hash = self.Quxs_lines_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear") }.reject{|k,v| k.to_s == "total"}
line[:data] = line[:data].reject{ |k,v| [ "activity_price", "bar_days", "comment", "rap", ].include?(k.to_s)}
new_line = {}
tmp = QuxsLine.find(line[:data]["ssp"].to_i)
new_line["ssp_code"] = tmp.ssp_code.to_s.strip
new_line["activity_code"] = tmp.activity_code.to_s.strip
line[:data] = line[:data].reject{|k,v| k.to_s == "ssp" }
tmp = TypeCode.find(line[:data]["type_code"].to_i)
new_line["type_code"] = tmp.name.to_s.strip
line[:data] = line[:data].reject{|k,v| k.to_s == "type_code" }
new_line.merge(line[:data])
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = QuxsLine.to_xlsx(opts)
fixed_lines = nil
fixed_lines = self.quuxs_lines_fixed_lines
res_hash = self.quuxs_lines_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data]["title"] = fixed_lines.select { |fline| fline[:title].gsub('[CUR]', self.currency_symbol) == line[:data]["title"] }.first.try(:[], :id)
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear")}
line[:data] = line[:data].reject{ |k,v| [].include?(k.to_s)}
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = QuuxsLine.to_xlsx(opts)
fixed_lines = nil
fixed_lines = self.quuuxs_lines_fixed_lines
res_hash = self.quuuxs_lines_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data]["title"] = fixed_lines.select { |fline| fline[:title].gsub('[CUR]', self.currency_symbol) == line[:data]["title"] }.first.try(:[], :id)
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear")}
line[:data] = line[:data].reject{ |k,v| [].include?(k.to_s)}
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = QuuuxLine.to_xlsx(opts)
fixed_lines = nil
fixed_lines = self.foes_lines_fixed_lines
res_hash = self.foes_lines_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data]["title"] = fixed_lines.select { |fline| fline[:title].gsub('[CUR]', self.currency_symbol) == line[:data]["title"] }.first.try(:[], :id)
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear")}
line[:data] = line[:data].reject{ |k,v| [].include?(k.to_s)}
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = FoesLine.to_xlsx(opts)
fixed_lines = nil
fixed_lines = self.honks_lines_fixed_lines
res_hash = self.honks_lines_grid(year,version,nil,fixed_lines,opts)
res_hash = res_hash.map do |line|
line[:data]["id"] = line[:id]
line[:data]["title"] = fixed_lines.select { |fline| fline[:title].gsub('[CUR]', self.currency_symbol) == line[:data]["title"] }.first.try(:[], :id)
line[:data] = line[:data].reject{|k,v| k.to_s.starts_with?("PrevYear") || k.to_s.starts_with?("NextYear")}
line[:data] = line[:data].reject{ |k,v| [].include?(k.to_s)}
end
opts = { xlsx_export: true }
opts[:package] = package
opts[:header] = res_hash.first.try(:map){|row| row[0].to_s}
if (opts[:header] != nil && opts[:header].index("id"))
opts[:hidden_cols] = [opts[:header].index("id")]
end
opts[:data] = res_hash.map{|row| row.to_a.map{|t| t[1]}}
package = HonksLine.to_xlsx(opts)
end
end
Hi @akimd, thank you for providing this code sample! I can reproduce the slow behavior. I agree nothing really jumps out at me, so I'll need to take a closer look.
Just a check-in here: I know what's happening. It's hard to see, but after staring at the code for a while I realized there is a thread through the code where […]. I don't know yet how to fix the issue in Brakeman, but I did want to let you know I have not given up yet.
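One reading of that "thread" (an assumption about the repro above, not a statement about Brakeman internals): `opts[:package] = package` followed by `package = X.to_xlsx(opts)` means each of the file's near-identical sections reassigns `package` from a call whose argument references the previous `package`. A tracker that records the value as a nested call expression therefore grows one level deeper per section, and deep-cloning that expression at every assignment gets steadily more expensive:

```ruby
# Toy model (not Brakeman's representation) of the value "thread":
# every repeated section does `package = X.to_xlsx(opts)` where `opts`
# carries the previous `package`, so the symbolic value nests one
# level deeper per section.
symbolic = [:call, :new]                           # package = Axlsx::Package.new
8.times do                                         # one per repeated section
  symbolic = [:call, :to_xlsx, [:hash, symbolic]]  # package = X.to_xlsx(opts)
end

puts symbolic.flatten.size  # 26: the tracked expression grows with every section
```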
Justin, if there's no way to get this to work properly, maybe a new type of timeout would be useful. Cheers!
Fixed with #1820 ⚡
Great news, thanks a lot!
Background
Brakeman version: 4.10.1
Rails version: 6
Ruby version: 3
Issue
Hi,
Brakeman appears to be nonterminating, consuming lots of CPU and memory on one of our projects. I managed to narrow this down to a single model file, which is huge (135k lines; it's generated code). I would like to narrow it down further to see where, in its chain of processing, Brakeman gets stuck. Is there some feature like a more verbose variant of --debug for such cases? Do you have advice on how to restrict the scope of the problem even further?
Thanks in advance!