A simple URL checker that reads a list of URLs, normalizes them, removes duplicates, and saves the result to a new file.
I. What does it do?
- The program reads a plain-text (.txt) file of URLs. It strips the scheme (http/https), a leading "www.", and everything after the first slash.
- This leaves only the "clean" URL, e.g. site.com.
- It then removes all duplicate URLs and writes the resulting list to a new file.
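The steps above can be sketched in Python roughly as follows; the file names and function names here are illustrative, not taken from the actual implementation:

```python
import re

def clean_url(url: str) -> str:
    """Strip the scheme (http/https), a leading 'www.', and any path,
    leaving only the bare domain, e.g. 'site.com'."""
    url = url.strip()
    url = re.sub(r'^https?://', '', url)   # drop the scheme
    url = re.sub(r'^www\.', '', url)       # drop a leading 'www.'
    return url.split('/')[0]               # drop everything after the first slash

def dedupe_urls(in_path: str, out_path: str) -> None:
    """Read URLs from in_path (one per line), clean them, remove
    duplicates while preserving order, and write them to out_path."""
    seen = set()
    with open(in_path, encoding='utf-8') as f:
        urls = [clean_url(line) for line in f if line.strip()]
    unique = [u for u in urls if not (u in seen or seen.add(u))]
    with open(out_path, 'w', encoding='utf-8') as f:
        f.write('\n'.join(unique) + '\n')
```

For example, an input file containing `http://www.site.com/page`, `https://site.com/other`, and `www.example.org` would produce a file with just `site.com` and `example.org`.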