Can I get a vote from you guys? I've been digging into this, and I think there are essentially 3 ways it could be implemented:
1. Add an optional parameter to the item/parameter keyword (only used for STRING/BLOCK) that indicates the default value is to be read as hex, octal, or binary. This is a very safe approach, but it adds yet more complexity to the cmd/tlm defs.
2. Rely on an indicator at the beginning of the default value string that the rest of the string is to be read as hex, octal, or binary (i.e. if the string starts with "0x", read the rest of it as hex). This would eliminate the need for a new cmd/tlm def param, but there is the potential for accidental side effects.
3. Pass the default value through an eval statement, so it naturally handles the normal notation for hex/octal (i.e. "\xff\xff\xff\xff"). Of course, that also allows arbitrary code execution, which is pretty unsafe. On the other hand, maybe we don't care too much about security at this point, since bad cmd/tlm definitions can already wreck the system in a lot of ways?
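For comparison, options 2 and 3 might look something like the sketch below. This is purely illustrative; `parse_default` and `parse_default_eval` are hypothetical names, not the actual cmd/tlm def code:

```ruby
# Approach 2 (sketch): detect a "0x" prefix and convert the rest of
# the string from hex digits to a binary string.
def parse_default(value)
  if value.start_with?("0x")
    [value[2..-1]].pack("H*") # "0xABCD" -> "\xAB\xCD"
  else
    value
  end
end

# Approach 3 (sketch): pass the default through eval so normal Ruby
# string escapes like "\xff" work -- but this executes arbitrary code.
def parse_default_eval(value)
  eval(value) # UNSAFE: value could be any Ruby expression
end

parse_default("0xABCD").bytes                    # => [171, 205]
parse_default_eval('"\xff\xff\xff\xff"').bytes   # => [255, 255, 255, 255]
```

The eval version is the shortest to implement, but it makes every cmd/tlm definition file a place where arbitrary Ruby can run.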
Do you guys have a preference? Or do you see another way I'm missing?
The only items with default values are parameters. If a parameter is a BLOCK and the default is "0xABCD", then we translate that into binary. I think that makes the most sense to people. If the parameter is a STRING and the default is "0xABCD", we leave it as a string. Strings are strings, and blocks are binary. So I vote for option 2, but only supporting BLOCK.
There are cases with STRING where you still need to be able to specify hex (CR/NL/TAB, for example). Using 0x as a detection prefix is fine. It would be nice if a quoted string, "0x00", did not convert to binary, but an unquoted 0x00 did.
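That quoted-vs-unquoted rule could be sketched like this; `convert_default` is a hypothetical name and the quoting check is just an illustration of the idea, not the actual parser:

```ruby
# Sketch: a quoted value keeps its literal characters; an unquoted
# 0x value converts to a binary string; anything else passes through.
def convert_default(raw)
  if raw.start_with?('"') && raw.end_with?('"')
    raw[1..-2]                # quoted: "0x00" stays the string 0x00
  elsif raw.start_with?("0x")
    [raw[2..-1]].pack("H*")   # unquoted hex: 0x00 becomes one zero byte
  else
    raw
  end
end

convert_default('"0x00"')  # => "0x00" (stays a string)
convert_default('0x0D0A')  # => "\r\n" (CR/NL as binary bytes)
```

This keeps the 0x prefix usable for control characters in STRING defaults while giving the definition author an escape hatch when they really mean the literal text.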