Issue 5543 - conv.to should convert char to integral/floating-point as a single-char string #1017
Conversation
Maybe add a failing conversion test?

```d
assertThrown!ConvException(to!int('a'));
```
@monarchdodra: Good idea, thanks.
Hum. This conversion scheme is currently a one-way street. I don't think it is acceptable to introduce:

```d
assert(to!char(9) == '9');  // Currently fails
assert(to!char(49) == '1'); // Currently passes
```

Should be:

```d
assert(to!char(9) == '9');  // Passes
assert(to!char(49) == '1'); // Fails
```
Yeah, slipped my mind. There will still be a one-way street for floating-point.
Really? Why? You'll have a tough time selling this to me if I can do this:

```d
assert(5.0.to!int().to!char() == '5');
```
I mean
TBH this is a breaking change, and I personally hardly find it a 100% clean cut for `to!int('2')` to return 2 rather than the ASCII code. I suppose it's more adequate to add this kind of function to std.ascii and later to std.uni (as there are other code points that have a numeric value). In the Unicode world this is called the numeric value property of a code point. See e.g. And certainly I don't expect a floating point <--> code point conversion, as there is no such mapping anyway.
I don't know. I tend to think that dealing in code units is low enough level that you don't really want to encourage it with stuff like this. If the person knows what they're doing, then casting is generally fine. If they don't, then this just helps them blow their foot off. But even with
`char` already implicitly converts to `int`, so you wouldn't use `to` for that. Anyway, if it should be put somewhere else or rejected, I don't mind.
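For reference, a small example of that implicit conversion; this is plain language behavior, independent of this PR:

```d
unittest
{
    char c = 'a';
    int i = c;        // implicit widening: no cast and no to!int needed
    assert(i == 97);  // the code point of 'a'
}
```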
@jmdavis Yeah, I agree that converting to/from ASCII (or Unicode, for that matter) should be handled by std.ascii / std.uni. At the lowest level we know that `char`, `wchar`, and `dchar` just store the Unicode number of the character, but it's better to indicate intent in source code by using std.ascii/std.uni wrappers for `cast(int)ch` and `(ch - '0')`, say `codePointIdx(ch)` and `numericValue(ch)` respectively, or something along those lines. It makes code more readable and less prone to careless slips.
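A minimal sketch of what such wrappers could look like; `codePointIdx` and `numericValue` are the commenter's hypothetical names, not existing Phobos functions:

```d
import std.ascii : isDigit;
import std.conv : ConvException, text;

/// Hypothetical wrapper for cast(int)ch: names the intent of
/// reading a character's Unicode code point.
uint codePointIdx(dchar ch)
{
    return cast(uint) ch;
}

/// Hypothetical wrapper for (ch - '0'): the numeric value of a
/// decimal digit, throwing on anything else.
int numericValue(dchar ch)
{
    if (isDigit(ch))
        return cast(int)(ch - '0');
    throw new ConvException(text("'", ch, "' has no numeric value"));
}

unittest
{
    assert(codePointIdx('a') == 97);
    assert(numericValue('7') == 7);
}
```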
Discussion moved here: http://d.puremagic.com/issues/show_bug.cgi?id=5543
Essentially, char conversion would behave as if it were a string of length 1; the before/after behavior is sketched below.
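A sketch of that before/after behavior, reconstructed from the assertions quoted in the conversation above rather than from the PR's actual test code:

```d
import std.conv : to, ConvException;
import std.exception : assertThrown;

unittest
{
    // Before: a char converted via its code point,
    // so to!int('2') gave 50 and to!char(49) gave '1'.

    // After: a char converts as if it were the one-char string "2".
    assert(to!int('2') == 2);
    assert(to!int('2') == "2".to!int());

    // Non-digit characters have no numeric reading, so they throw.
    assertThrown!ConvException(to!int('a'));

    // And 9 converts to the character '9'; 49 has no single-character
    // decimal form, so presumably it throws rather than yielding '1'.
    assert(to!char(9) == '9');
    assertThrown!ConvException(to!char(49));
}
```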