Dump and CSV file #841

Open
WilliJoin opened this issue Sep 30, 2014 · 4 comments

Comments

@WilliJoin

  1. sqlmap overwrites the previously dumped table when the same scan is started again; a switch is needed that lets the user decide whether to overwrite or not.
  2. If I dump a table named, say, "test" with 70,000 entries and then dump the same table again (or use --dump-all), sqlmap gets stuck on this table for a long time. Already fully dumped tables should be skipped, and if a table was not fully dumped, the dump should resume from the last entry.
  3. If the OS running Python restarts, shuts down or gets a Blue Screen of Death, imagine sqlmap had been dumping a table with 1,000,000 entries for the last 5 days and had retrieved most of them, but not all. After such an event sqlmap cannot resume the dump and starts everything from the beginning, even though session.sqlite appears to contain the already dumped data. So sqlmap should save results to the dump file periodically (e.g. every 10 minutes) instead of waiting for the full dump before writing to file (a rough sketch of what I mean follows this list).
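To illustrate point 3, here is a minimal sketch in Python of the kind of periodic flushing being asked for; retrieve_row is a hypothetical generator standing in for sqlmap's row retrieval, and this is not sqlmap code:

import csv
import time

FLUSH_INTERVAL = 600  # seconds (10 minutes)

def dump_table(retrieve_row, out_path, columns):
    """Append retrieved rows to a CSV file, flushing to disk periodically
    so a crash or reboot loses at most FLUSH_INTERVAL worth of work."""
    last_flush = time.time()
    with open(out_path, "a", newline="") as handle:
        writer = csv.writer(handle)
        if handle.tell() == 0:            # brand-new file: write the header once
            writer.writerow(columns)
        for row in retrieve_row():        # hypothetical generator, one row per retrieved entry
            writer.writerow(row)
            if time.time() - last_flush >= FLUSH_INTERVAL:
                handle.flush()            # push buffered rows out to the file
                last_flush = time.time()
        handle.flush()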
@stamparm
Member

  1. can do this
  2. "and if a table was not fully dumped, the dump should resume from the last entry" -> sqlmap resumes queries stored in the session file to minimize the number of repeated requests. We can't "resume from the last entry" because sqlmap sees everything as query<->reply pairs (see the sketch after this list).
  3. can't write to a dump file before the last row is retrieved, because of formatting and filtering.
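For context, the query<->reply resumption described in point 2 can be pictured roughly like this; a minimal sketch in Python under the assumption of a simple SQLite key/value store, not sqlmap's actual session format or API:

import sqlite3

class SessionCache:
    """Minimal illustration of query<->reply resumption: each inference
    query and its reply are stored, so a re-run skips requests it has
    already made.  The cache knows nothing about table rows, which is why
    resumption happens per query rather than "from the last entry"."""

    def __init__(self, path):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS storage (query TEXT PRIMARY KEY, reply TEXT)")

    def lookup(self, query):
        # Return the cached reply for this exact query, or None if unseen
        row = self.conn.execute(
            "SELECT reply FROM storage WHERE query = ?", (query,)).fetchone()
        return row[0] if row else None

    def store(self, query, reply):
        self.conn.execute(
            "INSERT OR REPLACE INTO storage VALUES (?, ?)", (query, reply))
        self.conn.commit()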

I can feel your frustration, but writing a partial table dump, which would be totally different from the final formatted and filtered output, would just create new issues.

@mukareste

I am trying to figure out a legitimate reason to dump 70K entries. Is this a requirement of some sort of penetration test?

@bdamele bdamele added this to the 1.0 milestone Feb 25, 2015
@bdamele bdamele self-assigned this Feb 25, 2015
@bdamele bdamele changed the title Extensions Dump and CSV file Feb 25, 2015
@ghost

ghost commented Jun 4, 2016

  1. This also happens when extracting specific columns only and restarting. +1 for the overwrite confirmation.

@stamparm
Member

stamparm commented Jun 5, 2016

@lukapusic with the latest commit, old table dumps are backed up. For example, after a couple of runs:

$ ll /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv*
-rw-rw-r-- 1 stamparm stamparm 921 Lip  5 12:36 /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv
-rw-rw-r-- 1 stamparm stamparm 854 Lip  5 12:34 /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv.1
-rw-rw-r-- 1 stamparm stamparm 921 Lip  5 12:36 /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv.2
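For anyone curious how the numbered backups above can be produced, here is a minimal sketch in Python of one way to rotate an existing dump file before overwriting it; it illustrates the naming scheme only and is not sqlmap's actual implementation:

import os

def backup_existing_dump(path):
    """Before writing a new dump, move any existing file to the first free
    numbered suffix (users.csv -> users.csv.1, then users.csv.2, and so on)."""
    if not os.path.isfile(path):
        return
    counter = 1
    while os.path.isfile("%s.%d" % (path, counter)):
        counter += 1
    os.rename(path, "%s.%d" % (path, counter))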
