Regression in 0.0.31? #78
Can you provide your test file with URLs for me to check?
oh lolz, these are US-RSE jobs, right?
This is what I'm getting with your foo.md:
In your run I don't see that it detected any files (e.g., `files: []` is empty).
I think running it the same way as you works for me too:

```
$ urlchecker check foo/
original path: foo/
final path: foo/
subfolder: None
branch: main
cleanup: False
file types: ['.md', '.py']
files: []
print all: True
verbose: False
urls excluded: []
url patterns excluded: []
file patterns excluded: []
force pass: False
retry count: 2
save: None
timeout: 5

https://uwhires.admin.washington.edu/ENG/Candidates/default.cfm?szCategory=jobprofile&szOrderID=210005
https://uwhires.admin.washington.edu/ENG/Candidates/default.cfm?szCategory=jobprofile&szOrderID=209999
https://uwhires.admin.washington.edu/ENG/Candidates/default.cfm?szCategory=jobprofile&szOrderID=209997

🎉 All URLS passed!
```

Could you provide some way to reproduce what is happening so I could help?
But that invocation worked OK in 0.0.30.
Incidentally, what version of Python are you using?
Ahh, I see this:

```
2022-07-26 20:55:30,171 - urlchecker - ERROR - Error running task
```

This means the multiprocessing workers had an error. To get to the bottom of it, you can `pip install IPython` and add a breakpoint here: `urlchecker-python/urlchecker/core/check.py`, line 210 (at dad0bdc), and here: `urlchecker-python/urlchecker/core/urlproc.py`, line 208 (at dad0bdc). Then you'll want to run the function manually (so it doesn't get run by a worker), e.g.:

```python
kwargs = {
    "file_name": file_name,
    "exclude_patterns": exclude_patterns,
    "exclude_urls": exclude_urls,
    "print_all": self.print_all,
    "retry_count": retry_count,
    "timeout": timeout,
    "port": ports.pop(0),
}
check_task(**kwargs)
```

That will drop you into the second function. Then manually run the check for each URL and inspect what happens. Report back here and we will try to work on it!
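To make "manually run the check for each url" concrete, here is a minimal standalone sketch of that kind of per-URL check. This is not urlchecker's actual API — the function name `check_url` and the `opener` hook are made up for illustration; it just mirrors the retry count and timeout settings shown in the CLI output above:

```python
import urllib.request
from urllib.error import URLError


def check_url(url, retry_count=2, timeout=5, opener=urllib.request.urlopen):
    """Hypothetical sketch: return True if the URL answers with a
    2xx/3xx status, trying up to retry_count + 1 times in total.
    The opener parameter exists only so the logic can be exercised
    without touching the network."""
    for _ in range(retry_count + 1):
        try:
            with opener(url, timeout=timeout) as resp:
                if 200 <= resp.status < 400:
                    return True
        except URLError:
            pass  # connection error or timeout: retry
    return False
```

Running something like this interactively on each URL from the breakpoint (or inspecting the exception it swallows) should show whether the failure is in the requests themselves or in the worker plumbing.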
I have a fairly new one:

```
$ python --version
Python 3.9.12
```

I hope it's not that! If it is, we can find out with the check above. Let me know what you find!
And when we do figure it out, we should probably capture this particular error better so it's clearer to you what happened! I missed that error message the first time I looked at it.
Hey @crd477! I think I was able to reproduce your error - would you mind testing both commands shown at the branch? #80 (comment) Thank you!
Fixed with #82 |
Yes, sorry I didn't get back to you sooner - I got busy with something else, and I've been mostly AFK this week.
Just to be clear, the fact that these specific URLs fail is OK and is not related to my issue report. Thanks for the fix!
Hello, I've observed what seems to be a regression in the latest release.
With version 0.0.30, the URLs in the test file I have there are correctly flagged as problematic, but 0.0.31 doesn't appear to work at all.
Maybe I'm doing something wrong?