wx.py.shell may not show the auto-completion popup in python3 #889

Closed
tianzhuqiao opened this Issue Jun 20, 2018 · 1 comment


tianzhuqiao commented Jun 20, 2018

Steps to reproduce:

  1. Run demo/PyShell.py.
  2. In the PyShell window, type:

>>> import wx
>>> foo(wx.

  3. In Python 3, no auto-completion popup window appears after typing the dot, as it does in Python 2.

It seems the problem comes from the getTokens function in wx/py/introspect.py (around line 328):

def getTokens(command):
    """Return list of token tuples for command."""

    # In case the command is unicode try encoding it
    if isinstance(command, string_types):
        try:
            command = command.encode('utf-8')
        except UnicodeEncodeError:
            pass # otherwise leave it alone

    f = BytesIO(command)
    # tokens is a list of token tuples, each looking like:
    # (type, string, (srow, scol), (erow, ecol), line)
    tokens = []
    # Can't use list comprehension:
    #   tokens = [token for token in tokenize.generate_tokens(f.readline)]
    # because of need to append as much as possible before TokenError.
    try:
        if not PY3:
            def eater(*args):
                tokens.append(args)
            tokenize.tokenize_loop(f.readline, eater)
        else:
            tokens = list(tokenize.tokenize(f.readline))
    except tokenize.TokenError:
        # This is due to a premature EOF, which we expect since we are
        # feeding in fragments of Python code.
        pass
    return tokens
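Outside wx, the token loss can be reproduced with a short sketch (the fragment b'foo(wx.' stands in for the text typed in the shell; the variable names here are illustrative, not wx code):

```python
# Minimal sketch of the failure: list() on the token generator raises
# TokenError mid-iteration, so every token already produced is thrown
# away along with the exception.
import tokenize
from io import BytesIO

fragment = b'foo(wx.'  # the incomplete line typed in the shell
try:
    tokens = list(tokenize.tokenize(BytesIO(fragment).readline))
except tokenize.TokenError:
    tokens = []  # the list() result was never assigned, so nothing survives
print(tokens)  # empty -- no tokens for the completion code to inspect
```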

After replacing the line tokens = list(tokenize.tokenize(f.readline)) with the following lines, the problem goes away.

            for t in tokenize.tokenize(f.readline):
                tokens.append(t)
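Run in isolation, the incremental loop keeps every token yielded before the premature EOF, which is exactly what the completion code needs; a sketch under the same assumptions (b'foo(wx.' standing in for the shell input):

```python
# The incremental append keeps partial results: each token is stored as
# soon as the generator yields it, so the TokenError raised at the
# premature EOF no longer discards the earlier tokens.
import tokenize
from io import BytesIO

fragment = b'foo(wx.'  # the incomplete line typed in the shell
tokens = []
try:
    for tok in tokenize.tokenize(BytesIO(fragment).readline):
        tokens.append(tok)  # kept even if the next iteration raises
except tokenize.TokenError:
    pass  # premature EOF is expected for code fragments
# the NAME/OP tokens for foo, (, wx and . are now available
print([t.string for t in tokens])
```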
RobinD42 (Member) commented Jun 24, 2018

Fixed by #894

@RobinD42 RobinD42 closed this Jun 24, 2018
