int(u"\u1234") raises UnicodeEncodeError #37601
Comments
In python 2.2, int of a unicode string containing […]

>>> int(u"\u1234")
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
UnicodeEncodeError: 'decimal' codec can't encode character '\u1234' in position 0: invalid decimal Unicode string
>>>

I think it's important that int() of a string or […]
Logged In: YES

I don't see the problem:

>>> try:
...     int(u"\u1234")
... except ValueError:
...     print "caught"
...
caught
>>> issubclass(UnicodeEncodeError, ValueError)
True
Logged In: YES

Ah, thanks. Sorry.
Logged In: YES

PyUnicode_EncodeDecimal() is responsible for this change.
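For readers revisiting this old issue: a minimal sketch (not part of the original thread) showing that the class relationship discussed above still holds on Python 3, where int() of a non-decimal string raises a plain ValueError, and UnicodeEncodeError remains a subclass of ValueError, so an except ValueError clause catches both.

```python
# Python 3 sketch: int() rejects non-decimal characters with ValueError.
try:
    int("\u1234")  # U+1234 is not a decimal digit
except ValueError as exc:
    print("caught:", type(exc).__name__)  # caught: ValueError

# The subclass relationship from the discussion above still holds.
print(issubclass(UnicodeEncodeError, ValueError))  # True

# Note that non-ASCII *decimal* digits are still accepted by int():
print(int("\u0661"))  # ARABIC-INDIC DIGIT ONE -> 1
```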
Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.