
UA_String, signedness of UA_Byte and char #86


Closed
Stasik0 opened this issue Jun 10, 2014 · 21 comments

Comments

@Stasik0
Contributor

Stasik0 commented Jun 10, 2014

Hey folks,

in 9f7daee I pushed a small API for base64 handling that was taken from the public domain (it is included in a somewhat hacky way, but the actual implementation is fully hidden). The third-party functions require char* for input and output, whereas we require everything in strings and data to be UA_Byte*. The problem is that UA_Byte is explicitly unsigned while the signedness of char is implementation-defined. Together with our current warning flags this results in code lines like:

base64_decode_block((void*)(char*)(base64EncodedData->data), base64EncodedData->length, (void*)(char*)target, &state);

or

UA_String encodedString = {12, (UA_Byte*)"b3BlbjYyNTQx"};

which is IMHO a bit painful.

So, thinking of integrating third-party applications with our code, I have the following questions:

  • what is the benefit of making UA_Byte explicitly unsigned? It does not seem to make any difference, since the (void*)(char*) casts work
  • how can we make our generic data type UA_Byte compatible with the third-party generic data type char? UA_Byte vs. UA_SByte? (see the sketch below)
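
For readers outside the thread, a minimal sketch of the type situation described above (the UA_String layout and UA_Byte typedef are simplified here, and base64_decode is a hypothetical stand-in for the third-party API):

#include <stdint.h>

typedef uint8_t UA_Byte;   /* IEC 62541 Byte: 8 bit, unsigned */

typedef struct {
    int32_t  length;
    UA_Byte *data;         /* unsigned char*, never plain char* */
} UA_String;

/* hypothetical stand-in for the third-party base64 prototype */
int base64_decode(const char *in, int in_len, char *out);

void decode_example(const UA_String *encoded, UA_Byte *target) {
    /* char* and unsigned char* are distinct pointer types, so passing
       encoded->data directly raises -Wpointer-sign; hence the casts: */
    base64_decode((const char *)encoded->data, encoded->length,
                  (char *)target);
}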

-Sten

@jpfr
Member

jpfr commented Jun 10, 2014

Hmm. Good point.
If we add the compiler flag -funsigned-char, can we then inline-define strings without casting?
(http://stackoverflow.com/questions/2054939/is-char-signed-or-unsigned-by-default)
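
For reference, the cast-free definition being asked about would look like the line below; whether it compiles without warnings under -funsigned-char is exactly the open question:

UA_String encodedString = {12, "b3BlbjYyNTQx"};   /* no (UA_Byte*) cast in front of the literal */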

@Stasik0
Contributor Author

Stasik0 commented Jun 10, 2014

It seems to be compiler- or even machine-dependent... using -funsigned-char would also allow a single cast to (char*) instead of (void*)(char*), but it would probably make porting to other compilers more complex. I'd take it as the last option (if there are others).

@uleon
Contributor

uleon commented Jun 10, 2014

IEC 62541 defines Byte as 8-bit unsigned and a ByteString as an array of Bytes. Therefore I decided a while ago that this should be an unsigned char, despite the fact that most C library functions take char arguments and need a cast anyway.
So I would prefer to leave it as is...

@Stasik0
Contributor Author

Stasik0 commented Jun 10, 2014

Yeah, but are you okay with changing the default signedness of char in gcc?


@jpfr
Member

jpfr commented Jun 10, 2014

The default signedness of char is implementation-defined.
Afaik it can be forced to unsigned in all compilers.
So adding this should not make us incompatible and/or platform-dependent.

@Stasik0
Contributor Author

Stasik0 commented Jun 10, 2014

Do you mean to override it in the headers?

@jpfr
Member

jpfr commented Jun 10, 2014

Can it be done with headers only?

The easy way out would be to replace typedef uint8_t UA_Byte; with typedef char UA_Byte; and live with the ambiguity. Though that's kinda icky as 62541 is explicit about the signedness.

Afaik this can be over-specified to be unsigned in all compilers.

http://www.scriptol.com/scripts/compilers-options.php
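
A quick sketch of the two alternatives being weighed here (the preprocessor switch is purely illustrative, not a proposed API):

#include <stdint.h>

#ifdef UA_BYTE_AS_PLAIN_CHAR   /* hypothetical switch, for illustration only */
typedef char UA_Byte;          /* "easy way out": literals assign without casts,
                                  but the signedness is left to the compiler */
#else
typedef uint8_t UA_Byte;       /* explicit unsigned, as IEC 62541 requires; a string
                                  literal assigned to UA_Byte* then needs a cast
                                  (or a silenced pointer-sign warning) */
#endif

UA_Byte *encoded = (UA_Byte*)"b3BlbjYyNTQx";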

@uleon
Contributor

uleon commented Jun 10, 2014

I'd opt for keeping UA_Byte as uint8_t. I can live with a handful of strange casts -- which are perfectly documented by this thread ;-)

@Stasik0
Contributor Author

Stasik0 commented Jun 10, 2014

I opt for keeping UA_Byte as uint8_t and going for a well-documented compiler flag, to halve the number of casts.


@jpfr
Member

jpfr commented Jun 10, 2014

👍

@uleon
Contributor

uleon commented Jun 11, 2014

@Stasik0 👍

@jpfr
Member

jpfr commented Jun 12, 2014

The base64 code breaks with -funsigned-char. :-(

@Stasik0
Contributor Author

Stasik0 commented Jun 12, 2014

Damn, I'd assume we will have similar problems with crypto libraries... What do we do? Just nothing?


@jpfr
Member

jpfr commented Jun 12, 2014

Crypto will not be problematic as long as the libraries are compiled with their own makefiles.

How about this?
https://github.com/raldenhoven/lgpl-ffmpeg/blob/master/source/libavutil/base64.h
https://github.com/raldenhoven/lgpl-ffmpeg/blob/master/source/libavutil/base64.c

@Stasik0
Contributor Author

Stasik0 commented Jun 12, 2014

I am currently not in Aachen, so I can only look into it on the weekend. Leave it as is and give me some time until Saturday, okay? The outer API of base64, as used in the tests, will not change...


@jpfr
Member

jpfr commented Jun 12, 2014

ok, sure.

@Stasik0
Contributor Author

Stasik0 commented Jun 14, 2014

@jpfr taking ffmpeg would force us to give up the static linking exception :/ I fixed the problematic code in base64 and it seems to work; however, declaring strings like
UA_String encodedString = {12, "b3BlbjYyNTQx"};
is still not possible, even with -funsigned-char. How come?

@jpfr
Member

jpfr commented Jun 14, 2014

Hmm. There's some funny stuff in the spec. -funsigned-char does not relate to pointers.
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=28912

Works with an additional flag. Is this still ok?
8404438
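
In other words (a small self-contained illustration of the point in the GCC report, reusing the simplified types sketched earlier in this thread): char, signed char and unsigned char are always three distinct types, so -funsigned-char changes the value range of plain char but not the compatibility of the pointer types.

#include <stdint.h>
typedef uint8_t UA_Byte;
typedef struct { int32_t length; UA_Byte *data; } UA_String;

/* A string literal has type char[N]. Even with -funsigned-char, char* and
   unsigned char* remain incompatible pointer types, so this initialization
   still raises -Wpointer-sign: */
UA_String encodedString = {12, "b3BlbjYyNTQx"};

/* It only compiles silently with an explicit cast ... */
UA_String withCast = {12, (UA_Byte*)"b3BlbjYyNTQx"};

/* ... or with the pointer-sign warning turned off, which is what the
   follow-up below settles on. */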

@jpfr
Member

jpfr commented Jun 14, 2014

Oh hey. It works with just -Wno-pointer-sign and no -funsigned-char.
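
The end state, as a sketch (the flag is the one named in this comment; the actual build changes live in the commit referenced above, and the file name here is made up):

/* compile with e.g.: gcc -c -Wno-pointer-sign strings.c */
#include <stdint.h>
typedef uint8_t UA_Byte;
typedef struct { int32_t length; UA_Byte *data; } UA_String;

/* No cast and no -funsigned-char needed once -Wpointer-sign is disabled. */
UA_String encodedString = {12, "b3BlbjYyNTQx"};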

@Stasik0
Contributor Author

Stasik0 commented Jun 14, 2014

Okay, fine for me as it is.

@Stasik0 Stasik0 closed this as completed Jun 14, 2014