
feat: Printing the url list to txt file with output-file #10

Closed

Conversation

ramazansancar
Contributor

upd: The URL list of every request is saved to the ./output/output.txt file.
feat: Print query results as a JSON array with --output-json.
upg: Dependencies upgraded.
upd: New flags added to the README.

I also checked it with 17k rows of data. It worked flawlessly.

I hope I didn't write bad code. I recently started working on improving my Go skills. Thanks @atomicptr

@ramazansancar
Contributor Author

This fulfills the requirement from #4.

Owner

@atomicptr left a comment


Hey thanks for the pull request!

I have added some small change requests, but otherwise it looks good! Great job :)

Review threads (outdated, resolved): pkg/crawler/log.go (×6), pkg/cli/crawl/commons.go
@ramazansancar
Contributor Author

When I get to a computer, I will ask in detail about the necessary corrections and what some of the requested changes mean. Thanks @atomicptr

@ramazansancar
Contributor Author

I also updated the README and help sections, so that problem won't occur again. Thanks for the information and collaboration. @atomicptr

Best regards from Turkey

@ramazansancar
Contributor Author

There seems to be a problem with the records after scanning large XML files :(

[screenshot]

It seems like this could solve the problem:

if data["status"] != nil && data["url"] != nil && data["time"] != nil && data["duration"] != nil {
	// All expected fields are present, so the type assertions below are safe.
	// (With || instead of &&, a partially populated entry would panic here.)
	status = data["status"].(float64)
	url = data["url"].(string)
	time = data["time"].(float64)
	duration = data["duration"].(float64)

	_, err = file.WriteString(fmt.Sprintf("%d\t%s\t%d\t%d\n", int(status), url, int(time), int(duration)))
} else {
	// Fall back to writing the raw message when any field is missing.
	_, err = file.WriteString(message + "\n")
}

I may need some time to make sure the code works.
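As background on why those assertions can panic: encoding/json decodes every JSON number in an `interface{}` as `float64`, and a plain type assertion on a missing or mistyped key panics at runtime. The comma-ok form sidesteps that entirely. A minimal sketch of the same logic (the helper name `formatRecord` is hypothetical, not from this PR's code):

```go
package main

import "fmt"

// formatRecord extracts the expected fields from a JSON-decoded log entry
// using comma-ok type assertions, so a missing or mistyped field can never
// panic; it falls back to the raw message otherwise.
func formatRecord(data map[string]interface{}, message string) string {
	status, ok1 := data["status"].(float64) // encoding/json numbers decode as float64
	url, ok2 := data["url"].(string)
	t, ok3 := data["time"].(float64)
	duration, ok4 := data["duration"].(float64)

	if ok1 && ok2 && ok3 && ok4 {
		return fmt.Sprintf("%d\t%s\t%d\t%d\n", int(status), url, int(t), int(duration))
	}
	return message + "\n" // fall back to the raw message
}

func main() {
	entry := map[string]interface{}{
		"status":   float64(200),
		"url":      "https://example.com/",
		"time":     float64(1634),
		"duration": float64(120),
	}
	fmt.Print(formatRecord(entry, "raw line"))
	fmt.Print(formatRecord(map[string]interface{}{}, "raw line"))
}
```

The comma-ok checks replace both the nil checks and the unchecked assertions in one step, which also covers the case where a key exists but holds an unexpected type.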

@ramazansancar
Contributor Author

This time the problem is solved :)

[screenshot]

I thought it would be good to relocate the error message so it can be parsed.

[screenshot]
I'm sending the final version this way.

The command I used for testing (the sitemap contains about 17,000 pages):

crab crawl:sitemap https://www.agroworlddergisi.com/sitemap_index.xml --num-workers=500 --http-timeout=60s --output-file ./output/output.txt --output-json ./output/output.json

Review threads (outdated, resolved): pkg/crawler/log.go (×4)
@atomicptr
Owner

Thanks for the changes, I found a few more things but otherwise it looks great :)

@ramazansancar
Contributor Author

I have completed the final changes required. I am updating the code and sending it. @atomicptr

@atomicptr
Owner

whoops something went wrong here 👀

@atomicptr
Owner

I kinda fucked something up by using a gh CLI tool, not quite sure what yet, although I merged the changes with #11

Thank you very much for the PR! :)

@ramazansancar
Contributor Author

Thanks for the fixes and the merge. @atomicptr
