I was testing the framework with -t 1000 -M 1000 threads and I've seen that a deadlock sometimes occurs in this function. The framework now has a small SQLite database which I use for the web interface and the API. To prevent the deadlock, we must remove flock and write the logs directly to the hosts_log table in the database. We can use threads for inserting them (to avoid reducing scan speed), and move this part of core/log.py so it runs while inserting logs into the database. We can also use the scan_id hash for selecting the logs and for creating the report log file.
# this part must move into __log_into_file
if api_flag is 0:
    info(messages(language, 171))
hosts = []
for log in JSON_Data:
    if log["HOST"] not in hosts:
        hosts.append(log["HOST"])
for host in hosts:
    for sm in scan_method.rsplit(','):
        remove_old_logs(host, sm, scan_id, language, api_flag)
if api_flag is 0:
    info(messages(language, 170))
for log in JSON_Data:
    submit_logs_to_db(language, api_flag, log)
__log_into_file should also be renamed to __log_into_db and moved into the api.__database file.
Let me know if anyone has time to work on this.
Regards.