How many bits do we need to represent values up to x? In general, for x ≥ 1, that's ⌊log₂(x)⌋ + 1 bits. For example, to represent values up to 10, we need 4 bits. To represent up to 169, we need 8 bits.
I wanted to try implementing this in various programming languages as an exercise. The program reads a number from the command line (in argv) and prints the number of required bits; that's it!
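As a minimal sketch of the idea, here is one way it might look in Python, assuming the input is a nonnegative integer passed as the first command-line argument (the function name `bits_needed` is my own):

```python
import sys

def bits_needed(x: int) -> int:
    """Return how many bits are needed to represent the nonnegative integer x."""
    # int.bit_length() gives the bit count for x >= 1 (e.g. 10 -> 4, 169 -> 8).
    # It returns 0 for x == 0, so we treat 0 as needing at least 1 bit
    # (a convention choice; the article's examples start above zero).
    return max(x.bit_length(), 1)

if __name__ == "__main__":
    # Read the number from argv and print the required bit count.
    print(bits_needed(int(sys.argv[1])))
```

Running it as `python bits.py 169` would print `8`, matching the example above.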