Let an application adjust minimum key length check in HOTP/TOTP #2915

Closed
dholl opened this Issue May 20, 2016 · 13 comments

4 participants
@dholl

dholl commented May 20, 2016

I hope this is the correct place to report this issue; if not, please redirect me. :)

While trying to use TOTP in hazmat/primitives/twofactor/totp.py with secrets generated via google-authenticator, I encountered an incompatibility in the key length check. I found that each TOTP object contains an HOTP object for the *OTP work, and within hotp.py there is a check in HOTP's constructor that verifies the supplied key length:

    if len(key) < 16:
        raise ValueError("Key length has to be at least 128 bits.")

The problem here is that the key is supplied to this class after being decoded from base32, but the secrets generated by google-authenticator are 16 characters long in base32 form. After base32-decoding one of these secrets, the resulting key is only 10 bytes (80 bits) long, and thus fails hotp.py's check.
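The arithmetic is easy to confirm with the standard library (the secret below is a placeholder with the same shape as a google-authenticator secret, not a real one): base32 packs 5 bits per character, so 16 characters decode to exactly 10 bytes.

```python
import base64

# Placeholder 16-character base32 secret, the same shape as the ones
# google-authenticator emits (not a real secret).
secret_b32 = "AAAAAAAAAAAAAAAA"
key = base64.b32decode(secret_b32)

print(len(secret_b32))  # 16 base32 characters...
print(len(key))         # ...decode to only 10 bytes (80 bits), below the 16-byte check
```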

From looking up the algorithm requirements in RFC 4226, I conclude that google-authenticator is generating insufficiently long keys. Specifically, this text in section 4:

https://tools.ietf.org/html/rfc4226#section-4
R6 - The algorithm MUST use a strong shared secret. The length of the shared secret MUST be at least 128 bits. This document RECOMMENDs a shared secret length of 160 bits.

So, the cryptography project is including a correct check of at least 128 bits, but the base-32 encoded keys from google-authenticator contain only 80 bits.

This issue has also been noted here:
https://security.stackexchange.com/questions/45053/why-does-google-cripple-the-2fa-google-authenticator-pam-module

Would the cryptography project consider adding an additional argument to the TOTP and HOTP constructors to permit local code to adjust this minimum length check? This would be handy for applications where compatibility with an existing google-authenticator deployment is considered "important enough" to downgrade the minimum key sizes...

Thank you,
David

@reaperhulk

Member

reaperhulk commented May 20, 2016

Thanks for the detailed report, this is the right place.

Google Authenticator is, unfortunately, a very important source so I'm supportive of finding some way to do this. We could do:

class HOTP(object):
    MINIMUM_KEY_LENGTH = 16

    def __init__(self, key, length, algorithm, backend):
        if not isinstance(backend, HMACBackend):
            raise UnsupportedAlgorithm(
                "Backend object does not implement HMACBackend.",
                _Reasons.BACKEND_MISSING_INTERFACE
            )

        if len(key) < self.MINIMUM_KEY_LENGTH:
            raise ValueError(
                "Key length has to be at least {} bytes.".format(
                    self.MINIMUM_KEY_LENGTH
                )
            )

and changing MINIMUM_KEY_LENGTH would then allow this to work. That, however, would be mutating it for all future instantiations of HOTP (and is not consistent with the standard API ethos of this project).
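A toy model (not the real HOTP class) makes that drawback concrete: assigning to the class attribute relaxes the check for every subsequent instantiation, whereas a subclass would at least confine the relaxation to code that opts in.

```python
class Checker(object):
    MINIMUM_KEY_LENGTH = 16

    def __init__(self, key):
        if len(key) < self.MINIMUM_KEY_LENGTH:
            raise ValueError("key too short")
        self.key = key

# Mutating the class attribute leaks into all future instances:
Checker.MINIMUM_KEY_LENGTH = 10
weak = Checker(b"\x00" * 10)  # 80-bit keys now accepted everywhere

# A subclass would scope the relaxation to callers who ask for it:
class LenientChecker(Checker):
    MINIMUM_KEY_LENGTH = 10
```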

Alternately, we could add optional args to the TOTP/HOTP constructors:

class TOTP(object):
    def __init__(self, key, length, algorithm, time_step, backend, enforce_seed_length=True):
class HOTP(object):
    def __init__(self, key, length, algorithm, backend, enforce_seed_length=True):

Unfortunately that messes with our argument order (backend typically goes last), but to maintain API compatibility that might be necessary.

@dholl


dholl commented May 20, 2016

I would be fine with either implementation and thus defer to the project. For my own implementation whims, I'm inclined toward including the argument to the constructors instead of the static class member.

----- The following comments are only remotely related to the current discussion, in that they also concern future consideration of adding more arguments to TOTP's constructor. -----

On the topic of including an additional constructor argument:
At the moment, I've kludged together my own code where I had just commented out the check entirely, but then I also added a "window" argument to the constructor akin to the WINDOW_SIZE parameter described in:
https://github.com/google/google-authenticator/blob/master/libpam/FILEFORMAT

Would support for WINDOW_SIZE be within the future scope of the project as well? (And if so, does that raise any other concerns for constructor argument lists, etc.?) At the moment this isn't important, since an application can already achieve it without changing totp.py, just by calling .verify() with time adjusted by multiples of time_step. But again in my own code, since .verify raises an exception to indicate failure, I found that wrapping .verify in such a manner became ugly. For example, if the window is set to 3, then I'm calling .verify 3 times, with these values for time:
time - time_step
time
time + time_step
and thus .verify is guaranteed to raise an exception at least 2 out of the 3 times. So, in hopes of achieving constant execution time and minimizing the try/except blocks in my wrapper, I just extended .verify in my own code. But for my own simplicity, I only implemented the checks relevant to TOTP and not the simpler window case for HOTP...
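A sketch of the wrapper being described, with `InvalidToken` standing in for the library's twofactor exception and `otp` standing in for any TOTP-like object whose `verify()` raises on mismatch (the names here are illustrative, not pyca API additions):

```python
class InvalidToken(Exception):
    """Stand-in for cryptography.hazmat.primitives.twofactor.InvalidToken."""

def verify_with_window(otp, token, time, time_step=30, window=1):
    """Accept the token if it matches at any step in [-window, +window]."""
    last_exc = InvalidToken("no offsets tried")
    for offset in range(-window, window + 1):
        try:
            # With window=1 this calls verify() at time - time_step,
            # time, and time + time_step, as described above.
            return otp.verify(token, time + offset * time_step)
        except InvalidToken as exc:
            last_exc = exc
    raise last_exc  # every offset failed
```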

Perhaps to allow future constructor-argument scalability, would the library consider encouraging applications to pass all options explicitly using Python's keyword=value convention? This could start by supplying defaults for the existing arguments. For TOTP, this might be:
length=6, algorithm=SHA1(), time_step=30, backend=default_backend(), window=3, enforce_seed_length=True, ...
so that key would be the only required value, passed first in the argument list (after self...).


Anyhow, thank you very much for considering allowing applications to relax the key length check!

@dholl


dholl commented May 21, 2016

Alternatively, if the key length error were downgraded to a warning (via the warnings module), then application code could use filters (also from the warnings module) to selectively suppress it. The resulting code would be:

import warnings
...
    if len(key) < 16:
        warnings.warn("Key length has to be at least 128 bits.")

However, since the present behavior should not break current applications' expectations, the default behavior should probably remain the presently raised exception. As for how to permit an application to select between exceptions and warnings, I'll defer to your perspective on class variables versus constructor arguments. Here's an example using your class variable idea to illustrate:

import warnings
...

class HOTP(object):
    KEY_LENGTH_WARNINGS_INSTEAD_OF_EXCEPTIONS = False

    def __init__(self, key, length, algorithm, backend):
        ...
        if len(key) < 16:
            if self.KEY_LENGTH_WARNINGS_INSTEAD_OF_EXCEPTIONS:
                warnings.warn("Key length has to be at least 128 bits.")
            else:
                raise ValueError("Key length has to be at least 128 bits.")

Hmm, but now the code is becoming slightly more complex than the current library, so that is also a trade-off. Hmm... (still thinking...)

Nah, never mind. This warnings-via-a-switch idea would require that end-user applications perform 2 operations to permit weak keys for google-authenticator interop: set the flag, and then filter the warnings.
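For illustration, here is what those two operations would look like, using a hypothetical `check_key` helper to carry the switch (neither the helper nor the flag exists in the library):

```python
import warnings

def check_key(key, warn_instead=False):
    # Hypothetical helper mirroring the warnings-via-a-switch idea above.
    if len(key) < 16:
        if warn_instead:
            warnings.warn("Key length has to be at least 128 bits.")
        else:
            raise ValueError("Key length has to be at least 128 bits.")

# google-authenticator interop then takes *two* steps from the application:
with warnings.catch_warnings():
    warnings.simplefilter("ignore", UserWarning)  # step 2: filter the warning
    check_key(b"\x00" * 10, warn_instead=True)    # step 1: flip the switch
```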

@alex

Member

alex commented May 30, 2016

Ok, I finally understand:

This is not google-authenticator the app, it's the PAM module.

I sent a PR to them increasing the key size: google/google-authenticator#556

@dholl


dholl commented May 30, 2016

Yeah, the google-authenticator app and PAM module are a mess, IMHO. (details below to support my opinion) I was looking to use this (pyca's) cryptography library to provide backward compatibility with my existing users' inadequate google-authenticator OTP secrets, while I develop an upgrade path to stronger tokens.

(The rest of this can be skipped, since it isn't really relevant to the TOTP/HOTP implementation in the python cryptography library. It just explains why I was considering this library in the first place.)

IMHO -- Why google authenticator's OTP implementation is a mess:
Perhaps "mess" is too harsh, but a more accurate "suboptimal" just doesn't fetch enough attention.

Noting a few quotes from https://github.com/google/google-authenticator/wiki/Key-Uri-Format

Currently, the algorithm parameter is ignored by the Google Authenticator implementations.
Currently, the digits parameter is ignored by the Google Authenticator implementations.
Currently, the period parameter is ignored by the Google Authenticator implementations.

We derive that Google Authenticator implementations only support:

  • only SHA-1, not SHA256 or SHA-512
  • only 6 digits, not 7, 8, or 9 (10 doesn't make sense for the algorithm)
  • only 30 second time steps.

For a quick confirmation of this, the description of the PAM module's data file format (~/.google_authenticator) does not include digits, algorithm, or step size:
https://github.com/google/google-authenticator/blob/master/libpam/FILEFORMAT

Also, it doesn't help that the mobile app is closed-source, which raises trust issues.

So while I haven't dug into their actual PAM code base to compare against their documentation, in short I've written off the whole google-authenticator project as a nice proof-of-concept which has been great for raising awareness. However, I postulate that the community would benefit more from other existing implementations. So at the moment I'm just surveying Python crypto libraries to satisfy my own goal of providing backward compatibility before bugging my users to upgrade their tokens.

@alex

Member

alex commented May 30, 2016

Understood. I'd love to get a fix upstream; then adding a temporary workaround, which can be removed at a later date, is much more palatable.


@reaperhulk

Member

reaperhulk commented Jun 2, 2016

@dholl Alex got a patch merged that resolves this against the PAM module. Could you explain a bit about when these secrets are generated and how often they're regenerated? We're curious how many users will have long-lived 80-bit secrets and whether we need to do anything else.

@dholl


dholl commented Jun 3, 2016

From my experience with google-authenticator and its PAM module, there is nothing requiring users to regenerate keys. The usage pattern is that those 80-bit keys get generated once, and then forgotten about as long as user tokens keep functioning properly (FreeOTP, Google Authenticator, YubiKeys...). The responsibility rests solely on having security-knowledgeable system administrators --- which are rare. (Good idea: I'll add a timestamp to my PAM module's {H,T}OTP data store so that I may let administrators send key roll-over reminder emails to end users.)

I think from a library perspective for the python cryptography library, having some mechanism where an application designer may customize the checks at their own peril (this is "hazmat", right?) should be sufficient. Heck, one idea might be to move the checks to a separate class method. Then an application would be free to monkey-patch the class or create derived classes to their heart's content... (thus creating toxic waste, i.e. HAZardous MATerials...) No other external API change would be necessary.

@reaperhulk

Member

reaperhulk commented Jun 3, 2016

Would you say there's a significant prevalence of 80-bit keys in the wild? (I'm guessing yes, but I want to archive that in this issue.) If so, I think we should probably just lower the minimum length requirement in the code but (if we haven't) document that the RFC requires 128 bits or greater.

@dholl


dholl commented Jun 3, 2016

Good question. From digging through their git log, their code has been creating 80-bit keys since March 26, 2010, commit 35ae38faf1a7a55b8d1808d893d2e4c4a1ef7824.

For reference, SECRET_BITS is 80 in this commit:
google/google-authenticator@35ae38f#diff-0eba36668d1bae54fe38a65a15e5026dR38

So, I assume that all existing installations have inadequate keys, unless local admins have modified their sources, which is very unlikely.

@Ayrx

Contributor

Ayrx commented Jun 11, 2016

I like the below API @reaperhulk suggested.

class TOTP(object):
    def __init__(self, key, length, algorithm, time_step, backend, enforce_seed_length=True):

While it does break existing conventions, it's probably the cleanest possible approach.

This, however, should be weighed against the fact that if we do add it, it will most likely never be removed "at a later date", because there is no good way of judging how many 80-bit keys are in the wild.

@dholl


dholl commented Jun 12, 2016

Recognizing the permanence of any API, how about moving all the input checks to a separate function and allowing callers to override that function as needed? This lets the hazmat code focus on being hazmat and makes these safety checks caller-overridable. (Then the library doesn't have to get into the business of permitting 80 bits just for Google Authenticator shenanigans.)

For totp.py:

def _totp_inputchecks(key, length, algorithm, time_step, backend):
    # move all current input checks/assertions out of __init__ and into this function:
    if not isinstance(backend, HMACBackend):
        raise UnsupportedAlgorithm(
            "Backend object does not implement HMACBackend.",
            _Reasons.BACKEND_MISSING_INTERFACE
        )
    # Now invoke hazmat.primitives.twofactor.hotp._hotp_inputchecks:
    from cryptography.hazmat.primitives.twofactor.hotp import _hotp_inputchecks
    _hotp_inputchecks(key, length, algorithm, backend)

class TOTP(object):
    def __init__(self, key, length, algorithm, time_step, backend, inputcheckfunc=None):
        if inputcheckfunc is None:
            _totp_inputchecks(key=key, length=length, algorithm=algorithm, time_step=time_step, backend=backend)
        else:
            # Invoke user supplied function with compatibility for both *args and **kwargs:
            inputcheckfunc(key=key, length=length, algorithm=algorithm, time_step=time_step, backend=backend)
        # Now proceed as if all inputs were verified:
        self._time_step = time_step
        # But when we invoke the base class constructor, override its inputcheckfunc
        # with a no-op, since we've already called _hotp_inputchecks above.
        # (A lambda body must be an expression, so return None rather than pass.)
        self._hotp = HOTP(key, length, algorithm, backend, inputcheckfunc=lambda *args, **kwargs: None)

And equivalently for hotp.py:

def _hotp_inputchecks(key, length, algorithm, backend):
    # move all current input checks/assertions out of __init__ and into this function:
    if not isinstance(backend, HMACBackend):
        raise UnsupportedAlgorithm(
            "Backend object does not implement HMACBackend.",
            _Reasons.BACKEND_MISSING_INTERFACE
        )

    if len(key) < 16:
        raise ValueError("Key length has to be at least 128 bits.")

    if not isinstance(length, six.integer_types):
        raise TypeError("Length parameter must be an integer type.")

    if length < 6 or length > 8:
        raise ValueError("Length of HOTP has to be between 6 to 8.")

    if not isinstance(algorithm, (SHA1, SHA256, SHA512)):
        raise TypeError("Algorithm must be SHA1, SHA256 or SHA512.")

class HOTP(object):
    def __init__(self, key, length, algorithm, backend, inputcheckfunc=None):
        if inputcheckfunc is None:
            _hotp_inputchecks(key=key, length=length, algorithm=algorithm, backend=backend)
        else:
            # Invoke user supplied function with compatibility for both *args and **kwargs:
            inputcheckfunc(key=key, length=length, algorithm=algorithm, backend=backend)
        # Now proceed as if all inputs were verified:
        self._key = key
        self._length = length
        self._algorithm = algorithm
        self._backend = backend
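Under this proposal, an application wanting google-authenticator compatibility could then supply a relaxed checker of its own. A hypothetical sketch (`relaxed_hotp_inputchecks` is application code, not library code; only the key check differs from `_hotp_inputchecks` above, and the remaining checks are elided):

```python
def relaxed_hotp_inputchecks(key, length, algorithm, backend):
    # Application-supplied checker: lower the key floor to 80 bits
    # (10 bytes) for google-authenticator interop; keep the rest.
    if len(key) < 10:
        raise ValueError("Key length has to be at least 80 bits.")
    if length < 6 or length > 8:
        raise ValueError("Length of HOTP has to be between 6 to 8.")
    # ...algorithm/backend checks as in _hotp_inputchecks...

# Hypothetical usage, accepting an 80-bit google-authenticator key:
# hotp = HOTP(key, 6, SHA1(), backend,
#             inputcheckfunc=relaxed_hotp_inputchecks)
```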

Aside: it looks like hotp.py and totp.py include redundant isinstance(backend, HMACBackend) checks?

@reaperhulk

Member

reaperhulk commented Jul 16, 2016

#3012 should resolve this.

@reaperhulk reaperhulk closed this Jul 16, 2016
