
keep saying 2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser start when I'm typing #241

Open
NewUserHa opened this issue Mar 28, 2018 · 24 comments


@NewUserHa

Python 3.6.5rc1.
I just did pip install ptpython, then ran scrapy shell ...
It prints this while I'm typing re[sponse]:

>>> re2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser start
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser calculated
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff: line_lengths old: 1, new: 1
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff replace old[1:1] new[1:1]
2018-03-29 03:55:35 [parso.python.diff] DEBUG: parse_part from 1 to 1 (to 0 in part parser)
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser end
 [F4] 2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser start                       [F2] Menu - CPython 3.6.5
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser calculated
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff: line_lengths old: 1, new: 1
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff replace old[1:1] new[1:1]
s2018-03-29 03:55:35 [parso.python.diff] DEBUG: parse_part from 1 to 1 (to 0 in part parser)
2018-03-29 03:55:35 [parso.python.diff] DEBUG: diff parser end

There is a completion popup, but it keeps printing these DEBUG messages, so it's unusable.

ptpython worked fine in my previous installation with Python 3.5, which I upgraded from to 3.6.5rc1.

@NewUserHa

The newest IPython, 6.2.1, has the same issue, and it's been reported that old IPython doesn't have the problem.
I'm considering trying an old ptpython.

@NewUserHa

ptpython 0.41 down to 0.36 all have this issue.

@Granitosaurus

Granitosaurus commented May 3, 2018

Seems like it's an issue with the parso dependency. I've submitted a PR to their repo to amend this. For the time being you can fix it by increasing the log level:

import logging; logging.getLogger('parso.python.diff').setLevel('INFO') 

You can probably add this somewhere in ptpython config to execute on startup.
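
A minimal sketch of that, assuming ptpython picks up a config.py with a configure(repl) hook (commonly ~/.config/ptpython/config.py; the exact path depends on your setup):

# Hypothetical ptpython config file (e.g. ~/.config/ptpython/config.py; the exact
# path depends on your setup). Assumes ptpython calls configure(repl) on startup.
import logging

def configure(repl):
    # Silence parso's diff-parser DEBUG chatter even when the root logger is at DEBUG.
    logging.getLogger('parso.python.diff').setLevel(logging.INFO)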

On a side note: I think ptpython should raise the log level across the board and have some sort of flag for that:

ptpython --log-level ERROR

It would be cool to have something like this that blocks any logging that is not an error.

EDIT: digging further, this seems to be caused by scrapy shell overriding the default log handler, which makes any logger with an unset level emit at DEBUG.

@davidhalter

I don't think using the DEBUG log level is wrong. It's pretty much the lowest you can use. If anything prints all log levels, that's just wrong in that library IMO.

The debug level in general just contains a lot of nonsense information.

@Granitosaurus

@davidhalter

I don't think using the DEBUG log level is wrong.

I don't think debug should be the default in any case, ever. The lowest logging level in Python is 0, which is the NOTSET level. I'm not exactly sure how NOTSET works, but it's generally not a good idea to leave a logger without a handler and with an unset level - then all kinds of issues pop up in other tools that control or wrap around loggers.

My point is that parso needs to set its logging to INFO explicitly for it to play nicely with ptpython and scrapy shell commands, or add a proper log handler.
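
For reference, the "proper log handler" part usually means the stdlib-documented pattern for libraries - a rough sketch, not parso's actual code (and note it does not stop records from propagating to a DEBUG root handler):

# Attach a NullHandler to the library's top-level logger so nothing is printed
# (and no "no handlers could be found" warning appears) when the application
# has not configured logging.
import logging

logging.getLogger('parso').addHandler(logging.NullHandler())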

@davidhalter

@Granitosaurus I'm happy to be convinced otherwise, but I've checked three big libraries: requests, django and paramiko. They all use logging pretty extensively and never use setLevel. So I'm pretty sure you are wrong and the issue is not with parso.

Show me other libraries that do this and I'm happy to reconsider.

@NewUserHa

I heard the 'DEBUG' log level should only be used in a development environment.

However, this issue makes ptpython unusable anyway, so..

@blueyed

blueyed commented May 19, 2018

FWIW: I could not reproduce it using ptpython master and pip install -e .ing it.

@NewUserHa

There was no issue when I was using Python 3.5 + IPython + ptpython,
but there is one after I uninstalled 3.5, installed a brand new 3.6, and then pip installed ptpython.

@blueyed

blueyed commented May 26, 2018

Clear steps for reproducing this would be good, since I could not reproduce it. Copy-and-pasting the terminal output might also help for clarity.
I am using Arch Linux's Python 3.6 by default, and IIRC I then created a virtualenv with it where I pip-installed ptpython.
Anyway, I'm not a ptpython user; I just came here from the parso issue tracker.

@williamww19

Is there any update on this issue? I had to uninstall my Anaconda and install it again to clear this up.

@NewUserHa

@blueyed can't you reproduce it by running scrapy shell?

@blueyed

blueyed commented Oct 7, 2019

@NewUserHa
Sorry, I apparently tried to help / look into that back in May, but only via parso - I am not using ptpython myself.

@NewUserHa

OK, I thought you were a contributor to this repo.

The issue is still there, and I haven't found any code in scrapy that changes the logging level of parso.

@davidhalter

@NewUserHa The issue is probably that the scrapy shell configures logging output for DEBUG on the root logger.
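
That alone would explain the output; a minimal illustration of the effect (not scrapy's actual code):

# Minimal illustration: once the root logger has a handler at DEBUG, every
# library logger whose level is unset (NOTSET) propagates its DEBUG records
# to that handler.
import logging

logging.basicConfig(level=logging.DEBUG)             # roughly what a shell might do on startup

lib_logger = logging.getLogger('parso.python.diff')  # level is NOTSET by default
lib_logger.debug('diff parser start')                # now printed via the root handler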

@NewUserHa

But I didn't find any sign of that in scrapy's code.
Are you sure it has nothing to do with ptpython and parso? Then I'll close this issue and open one in the scrapy repo.

@davidhalter

It's definitely not parso and also not jedi. It might be something about your environment, another dependency of ptpython/scrapy, or an import that you make. It's in there somewhere, keep looking ;-)

@blueyed

blueyed commented Oct 8, 2019

What about putting a pdb.set_trace() into the logging module?
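
Something along these lines, for example (a sketch for hunting down whoever installs the handler; the patch target is an assumption, untested against scrapy):

# Patch Logger.addHandler early so any code that installs a handler prints a
# stack trace (logging.basicConfig also ends up calling addHandler on the root
# logger).
import logging
import traceback

_orig_add_handler = logging.Logger.addHandler

def _traced_add_handler(self, handler):
    print(f"addHandler on logger {self.name!r}: {handler!r}")
    traceback.print_stack()
    # import pdb; pdb.set_trace()  # or drop into pdb here instead
    return _orig_add_handler(self, handler)

logging.Logger.addHandler = _traced_add_handler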

@Gallaecio

By default, the Scrapy logger uses DEBUG log level. That’s the root of the issue, which also affects ipython: ipython/ipython#10946

So for Scrapy it's easy to work around (you can change the level to INFO in the settings or use -L INFO on the command line).
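
For example, in the project settings (LOG_LEVEL is a standard Scrapy setting):

# In the Scrapy project's settings.py: raise the log level so completion inside
# scrapy shell no longer triggers parso's DEBUG output.
LOG_LEVEL = 'INFO'

or pass -L INFO on the command line (e.g. scrapy shell -L INFO).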

However, maybe this is something that should be handled by the shell. I mean, maybe the running shell should not make assumptions about the current logging level, and instead capture and discard all logging recorded during the internal code execution of the shell itself (e.g. during autocompletion).
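
A rough sketch of that idea (hypothetical helper, not ptpython's actual API):

# Temporarily raise the global logging threshold while the shell runs its own
# internal code (e.g. computing completions), then restore whatever was set before.
import contextlib
import logging

@contextlib.contextmanager
def suppress_logging(level=logging.CRITICAL):
    previous = logging.root.manager.disable   # value set by any earlier logging.disable()
    logging.disable(level)
    try:
        yield
    finally:
        logging.disable(previous)

# e.g. (hypothetical call site inside the shell):
# with suppress_logging():
#     completions = compute_completions(text)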

@davidhalter

davidhalter commented Oct 14, 2019

It might be a good idea to just disable the parso logger in Scrapy/ptpython altogether. I don't really think that it is needed in any case.

When people are enabling logging, they usually want to see the logs about the stuff they import and not the stuff that scrapy/ptpython imports.
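
A one-line sketch of what that could look like (not actual Scrapy/ptpython code):

# Stop parso's records from propagating to whatever root handler the host
# application (e.g. scrapy shell) has configured.
import logging

logging.getLogger('parso').propagate = False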

@NewUserHa

NewUserHa commented Oct 15, 2019

scrapy shell will NOT affect ipython 7.0.1.

@lhuaizhong

This bothered me for a long time and I couldn't find an ideal solution, but after I uninstalled the parso module, all was well!
pip uninstall parso

@Anurag360007

This is driving me crazy... please give a solution: after installing ptipython, when I open scrapy shell I get the same error.

@razzius

razzius commented Nov 23, 2021

It'd be great to fix the root issue, but for each session I found the following disabled the log spam: import logging; logging.getLogger().setLevel(logging.INFO)
