Add ActiveSupport::KeyGenerator as a simple wrapper around PBKDF2 #6952

merged 2 commits into rails:master

7 participants


As part of the changes for Rails 4 we want to sort out all the cryptography-related things we have in place right now. The first part of this will be to add PBKDF2 for key derivation; then we can switch the session store and cookies.signed to use derived keys instead of using the bare secret.

Similarly future functionality like encrypted cookies / sessions can utilize derived keys.

So before merging I want to get some feedback from @meder and @thaidn on the particulars of our implementation. In particular, what are suitable default values for iterations and key_size.

This is a prerequisite for #3955, #5034, and possibly others.

Note to other committers: Please leave this one for me to merge

@NZKoz NZKoz was assigned

Once I have someone confirm we're on the right track, I'll expand this functionality to provide something like:


I may also rename the class because it's deriving not generating keys. At that stage I think we'll be able to work on #3955 and friends


Preferred default parameters:

  • iterations: should be > 10000. I generally use 100000, but 16384 would not be bad at the moment. The key point is that it is slow enough, and that depends on the purpose.
  • salt: should have enough entropy. A length check would be good to have. 16 bytes or more?
  • derived key length: should be specified by the caller because it depends on the purpose. The SHA-1 output length (160 bits) would be a good default.
  • Something#derive_key: the interface should take at least a "salt" parameter - assuming its omission isn't just a typo.
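As a rough Ruby sketch of these defaults (the values and the master-secret string here are only illustrative, not the proposed API):

```ruby
require "openssl"
require "securerandom"

iterations = 100_000                        # > 10000; slow enough for the purpose
salt       = SecureRandom.random_bytes(16)  # at least 16 bytes of entropy
key_size   = 20                             # SHA-1 output length (160 bits) as a default

# Derive a key from the secret using PBKDF2-HMAC-SHA1
key = OpenSSL::PKCS5.pbkdf2_hmac_sha1("some master secret", salt, iterations, key_size)
key.bytesize  # => 20
```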

@emboss should confirm my thoughts. :)


iterations: should be > 10000. I generally use 100000, but 16384 would not be bad at the moment. The key point is that it is slow enough, and that depends on the purpose.

I agree, it should be at least a 5-digit number. When chosen too large, though, this can quickly become a vector for denial of service attacks. It largely depends on how much the infrastructure can handle - maybe choose a default like @nahi proposed, but still have it configurable somehow?

salt: should have enough entropy. A length check would be good to have. 16 bytes or more?

PKCS#5 says that the salt should be at least 8 bytes; I typically also use the 16 bytes that nahi proposed. What is essential here is that these bytes must be chosen by a cryptographically secure random number generator, OpenSSL::Random or SecureRandom. Ideally a fresh, per-user salt should be used each time, to rule out any opportunity for precomputation right from the start. Normally an attacker would not know the salt, but it's better to design as if they did, which is plausible - think of bribed/rogue devs/admins etc.
This would however require the salt to be part of the return value, or to be part of the interface and handed over by the user. The latter option has the disadvantage that users could still get salt generation wrong, though.

derived key length: should be specified by the caller because it depends on the purpose. The SHA-1 output length (160 bits) would be a good default.

Yes, the security margin is tightly related to the underlying digest. It can't exceed the output length of the digest - so in nahi's example of SHA-1, choosing more than 160 bits will not increase security. It's probably best to choose the exact output length of the digest. Some doubt has been cast over SHA-1 recently, and the current recommendation is to use SHA-256 wherever possible. Either way, you can then conveniently choose the dk_len parameter as OpenSSL::Digest#digest_length.
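For reference, the output lengths in question are available directly from OpenSSL::Digest#digest_length:

```ruby
require "openssl"

OpenSSL::Digest.new("SHA1").digest_length    # => 20 bytes (160 bits)
OpenSSL::Digest.new("SHA256").digest_length  # => 32 bytes (256 bits)
```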

One final note about timing attacks when comparing password hashes. It is argued from time to time on the web that one would have to use some "equal time" comparison method to compare password hashes in order not to leak any information through timing attacks. This is a very delicate issue. I asked a cryptographer whom I can absolutely trust on these matters, and his advice was as follows: if the underlying hash function were ideal, this attack would be even harder than finding collisions or preimages. So in theory there's no need to worry. But since current hash functions cannot be proven to be ideal, it doesn't hurt to do equal-time comparison - better safe than sorry.

If you'd like to follow his advice, the best method that I see currently for doing equal time comparisons is to do something like this:

d  = OpenSSL::Digest.new("SHA256")  # some digest
h1 = stored_password_hash           # first password hash
h2 = computed_password_hash         # second password hash
if d.digest(h1) == d.digest(h2)
  # the hashes match
end

Execution time of computing a digest is independent of its input. This method was also proposed in Dan Boneh's cryptography course on Coursera, so I believe we can trust it. The issue with other popular methods (keeping a "running sum" of XORs of the individual bits), besides being fairly complicated, is that optimizing compilers might one day "see what we did there" and "optimize" our efforts away :)


Execution time of computing a digest is independent of its input.

To clarify: it is of course still possible to time the comparison of the resulting digests. But that cannot leak any information about the original values if we use a cryptographically secure digest in the process.


To clarify the purpose here, we're not looking at using this for passwords. The bcrypt gem does that much more transparently and simply than we could here.

The goal is instead to let us derive multiple keys from a single application secret.

Currently we have:

config.secret_token = "af60a5492cabefa4f84707919d84354c5f7a7cfcd978f977ff16af4aaea4678b14b48f705f6cff7cd86706fd837dd9e2d00fb06e9bd74fa2e615f8a1787faac1"

In every generated app, we use that for signing cookies and the cookie store. However we want to make it possible to ship an encrypted cookie store, which requires an encryption key and a secret.

We also want to make it possible for users to generate their own MessageVerifiers and MessageEncryptors.

On the advice of the Google security guys, we don't want to use the same secret as both the signing secret and the encryption key. As a result we need a deterministic way to generate additional keys based on the secret and another value. That's the intended use of the KeyGenerator here.

When we sign cookies, we won't use the raw secret_token, we'll instead use:

  cookie_secret = Application.key_generator.derive_key(some_salt)

The salts will either be generated when your app is generated, or perhaps even just defaulting to static values like "cookie hmac secret", "cookie encryption key".
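To illustrate with OpenSSL's PBKDF2 directly (the iteration count, key length, digest, and secret value below are placeholders, not the final defaults):

```ruby
require "openssl"

secret = "some long secret_token value"  # the application's master secret
digest = OpenSSL::Digest.new("SHA1")

# Two purposes, two salts, two distinct keys from the one secret
hmac_key = OpenSSL::PKCS5.pbkdf2_hmac(secret, "cookie hmac secret", 1000, 20, digest)
enc_key  = OpenSSL::PKCS5.pbkdf2_hmac(secret, "cookie encryption key", 1000, 20, digest)

# Derivation is deterministic (same secret and salt always give the same key),
# but each purpose gets its own key.
hmac_key != enc_key  # => true
```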

So, is that a bit clearer? Additionally, does that plan make any sense? ;)


@NZKoz My comments and confirmations by @emboss are for key derivation, not directly for password authentication.

@emboss also talked about password authentication. That's also valid, because password authentication is actually a form of key derivation: deriving an authentication key (the stretched password hash) from a master key (the password). That said, I have no idea how Rails should use bcrypt and PBKDF2.

Ah, you meant that Application::Something#derive_key gets only a salt to derive a new key because you're going to hide the master key inside the Application, right? That would be fine.


@nahi exactly, take it as given that the secret is a securely generated master key which is secret. derive_key will be called with a salt which is either static, or configurable, for the sole purpose of deterministically generating keys to be used for other purposes in the application.

We can't use a random salt obviously, otherwise those keys are non-deterministic and we can't then re-validate cookies signed or encrypted with them.

Questions of timing attacks etc are moot as this only generates the keys, existing code in MessageVerifier does the hmac validation and that won't be changing.


@NZKoz Yeah, sorry for the confusion - I was under the impression that you were trying to replace bcrypt by PBKDF2. I should've read the title, too, not only the conversation ;) But as @nahi said, the same advice for PBKDF2 still holds for key derivation except for the comparison part because you no longer need to compare things.

A couple of questions about the master key. How is it derived? Is it some form of external input (a password?) or would it be generated within the application itself?

So that I get it right this time: you want to

  1. Create a master key
  2. Derive "session keys" deterministically from that master key in order to encrypt cookies or other things when needed?

If so, how are you planning to do deterministic derivation? That is, based on what information would you derive the key for let's say cookie encryption?


The master key itself is generated by SecureRandom when the application is generated. Users can change this as they wish.

The session key would be derived by providing another value as the salt to PBKDF2, with the master key as the 'password'. The particular values would probably be:

  key_generator = ActiveSupport::KeyGenerator.new(config.secret_token)
  cookie_signing_key = key_generator.generate_key(config.cookie_signing_salt || "Cookie Signing Key")

We could remove the default values, but that makes upgrading a pain in the ass. Essentially the main goal is to have a reliable way to generate keys by transforming the existing secret, so that we don't end up using a single secret as the hmac key, the encryption key, everywhere.

Using PBKDF2 to derive keys seems like a sane way to avoid that while simultaneously keeping things pretty straightforward for users.


I also discussed the issue with Jean-Philippe Aumasson. It was my impression, and he confirmed, that PBKDF2 is not the optimal choice in your context. PBKDF2 was designed to mitigate the poor entropy of passwords, but since you confirmed you would be using a SecureRandom master secret right from the start, there is no need for a password-based derivation algorithm.

I thought of something simple at first, like KDF 1-4 on this page, but Jean-Philippe warned me that there are issues when deriving keys in that manner. To overcome these problems, he recommended HMAC as a good solution.

It turns out that there is a standard using HMAC for key derivation that provides all of the features you are looking for, HKDF. It also comes with a paper.

In your situation, where you have a SecureRandom master key, you could skip the "Extract" phase altogether, and immediately use the "Expand" phase. Determinism is supported by providing appropriate "info" strings (see also the comments in section 3.2) to

HKDF-Expand(PRK, info, L)

As the underlying hash function, I would recommend to use SHA-256. This solution has the advantage that there's no need for salt values at all since your initial key material is already secure pseudo-random.
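For reference, HKDF-Expand is simple enough to sketch in a few lines of Ruby on top of OpenSSL::HMAC. This is an unvetted illustration of RFC 5869 (the function name is mine), not production code:

```ruby
require "openssl"

# HKDF-Expand(PRK, info, L) per RFC 5869, using HMAC-SHA256.
def hkdf_expand(prk, info, length)
  digest = OpenSSL::Digest.new("SHA256")
  blocks_needed = (length.to_f / digest.digest_length).ceil
  okm = +""
  t = +""
  (1..blocks_needed).each do |i|
    # T(i) = HMAC-Hash(PRK, T(i-1) | info | i), with i a single octet
    t = OpenSSL::HMAC.digest(digest, prk, t + info + [i].pack("C"))
    okm << t
  end
  okm[0, length]
end

prk = OpenSSL::Random.random_bytes(32)  # already uniformly random key material
cookie_key = hkdf_expand(prk, "cookie encryption key", 32)
cookie_key.bytesize  # => 32
```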

On the downside, there's no implementation available in the stdlib so far. But I've wanted to add key derivation support to Ruby OpenSSL for quite some time now, because it's also important for key exchange protocols.

@nahi Would you think it's OK to add HKDF support to Ruby OpenSSL even if it's not directly supported by OpenSSL itself? My guess is we could implement it entirely in Ruby, probably no need for native code.

Alternatively, you could use this gem or we could discuss adding a reduced version ("Expand" only) to Rails directly. What do you think?


I'm incredibly hesitant to add a dependency on crypto code that's not part of OpenSSL; it's hard enough to use crypto correctly, and implementing it ourselves is a touch too risky. If there's an implementation of HKDF in a later version of OpenSSL then we could make use of it there.

So you mention that PBKDF2 is not ideal because our keys already have sufficient entropy. If the sole issue is that the derivation will take an unnecessarily long time, then I'm not worried about it. We'll be generating a handful of keys per application, and doing it once. Are there additional concerns with using PBKDF2 in this way?


Valid points - and nahi confirmed that we would only add HKDF support to Ruby OpenSSL if OpenSSL itself supports it one day.

Regarding PBKDF2, it's always a bit tricky if an algorithm is used in slightly different ways than intended.

I also do believe that the iteration count should be no concern. My main concern is rather how to handle the salts and how to get the determinism without weakening anything.

A different question - will the master key be (re-)generated on startup (and only kept in memory) or is it read from (and stored to) an external location? If so, are there any measures to protect it there or is this left to the user?


Sorry, real life intervened.

The master key is generated when the application is generated, and it's up to the user to protect it beyond that. We have a rake task, rake secret, but all it does is print a new random secret to stdout. There's no KeyCzar-style functionality for rotating those keys; however, users can simply change the secret and Rails will silently discard the old cookies / sessions. Previously it raised exceptions, but you get lots of false positives from misbehaving bots and proxies truncating the headers.

My plan for the salts is that they're config values to the application with hard-coded defaults. For example, to sign cookies we may have a config value

  config.something.cookie_signature_salt = "7da55e95cab4feae54f0a8af22a4e0a1280b63b5d5055948066"

But if the value's not set by the user, simply default it to some known value like "org.rubyonrails.cookies.signature"?


Using a known salt is about as useful as using no salt, no?


@rkh In this case it doesn't add to the security, I would agree. But using a salt would make it possible to derive "sub keys" from the master key in a reproducible way.

Sorry, real life intervened.

No problem, it keeps doing that sometimes ;)

The salt itself can be treated as if it were public once established, but to prevent the ability to precompute values it should be selected randomly at some point. Instead of hardcoding them once and for all, would it be possible to set the individual values for cookie signing etc. at application generation time, together with the master key?

This part from PKCS#5 seems also good advice:

2) Otherwise, the salt should contain data that explicitly
distinguishes between different operations and different key
lengths, in addition to a random part that is at least eight
octets long, and this data should be checked or regenerated by
the party receiving the salt. For instance, the salt could have
an additional non-random octet that specifies the purpose of
the derived key. Alternatively, it could be the encoding of a
structure that specifies detailed information about the derived
key, such as the encryption or authentication technique and a
sequence number among the different keys derived from the
password. The particular format of the additional data is left
to the application.

Taking this into account, how about prefixing the salt with the purpose and then adding a random part? Something like
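For instance (a purely hypothetical format, generated once at application generation time):

```ruby
require "securerandom"

# A purpose prefix plus a random part of at least eight octets, per PKCS#5's advice.
cookie_signature_salt = "cookie-signature:" + SecureRandom.hex(8)  # 8 random bytes, hex-encoded
```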


The raw (non-hex) salt should still be at least 8 bytes long, to avoid generating the same key twice for different applications:

2) It is unlikely that the same key will be selected twice.
Again, if the salt is 64 bits long, the chance of "collision"
between keys does not become significant until about 2^32 keys
have been produced, according to the Birthday Paradox. This
addresses some of the concerns about interactions between
multiple uses of the same key, which may apply for some
encryption and authentication techniques.

Since the original master key is already secure random, I would assume it's fine to stay modest with the iterations...

@nahi Would you agree?


What we want is to provide a simple API for the developers (including Rails' core ones) to derive different keys from one single master key. I'd like to see something as simple as Application.key_generator.derive_key(info), where info might contain some non-random data to identify the derived key, e.g., "encrypted_cookie_store_key".

One simple way to do this is to use PBKDF2 with SHA256 as the hash function and pass the arguments as follows:

  • pass: the master key

  • salt: info that passed by the caller

  • iter: 1000. We don't need anything bigger here because the master key is already random.

  • keylen: 256 bits (= hashlen). This is enough key bits for all crypto operations.
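A minimal sketch of that proposal (the class, constructor, and method names here are assumptions for illustration, not the final API):

```ruby
require "openssl"
require "securerandom"

class KeyGenerator
  def initialize(master_key)
    @master_key = master_key
  end

  # Derives a 256-bit key; `info` identifies the derived key,
  # e.g. "encrypted_cookie_store_key".
  def derive_key(info)
    OpenSSL::PKCS5.pbkdf2_hmac(@master_key, info, 1000, 32, OpenSSL::Digest.new("SHA256"))
  end
end

generator = KeyGenerator.new(SecureRandom.hex(64))
generator.derive_key("encrypted_cookie_store_key").bytesize  # => 32
```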

This is actually a misuse of the OpenSSL-PBKDF2 API, but we really know what we are doing here for the following reasons:

1) In this setting, PBKDF2-HMAC is very similar to HKDF. When keylen is equal to hashlen, HKDF would output T_1, computed as follows:

T_1 = HMAC-Hash(PRK, info | 0x01)

and PBKDF2-HMAC would output this T_1, computed as:

F (P, S, c, i) = U_1 \xor U_2 \xor ... \xor U_c
T_1 = F (P, S, c, 1)

where

U_1 = PRF (P, S || INT (i))
U_2 = PRF (P, U_1)
...
U_c = PRF (P, U_{c-1})

Here, S is the salt (non-random info in our case), c is the iteration count, and INT (i) is a four-octet encoding of the integer i, most significant octet first. You probably noticed that HKDF's T_1 is actually PBKDF2-HMAC's U_1. In other words, the iteration count of 1000 actually makes PBKDF2 stronger than HKDF.
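This correspondence is easy to check with OpenSSL: a single PBKDF2-HMAC-SHA256 iteration at keylen = hashlen is exactly U_1 (variable names below are just for illustration):

```ruby
require "openssl"

master_key = OpenSSL::Random.random_bytes(64)
info       = "encrypted_cookie_store_key"  # plays the role of the salt S

# One PBKDF2 iteration with keylen == hashlen...
one_iteration = OpenSSL::PKCS5.pbkdf2_hmac(master_key, info, 1, 32,
                                           OpenSSL::Digest.new("SHA256"))

# ...equals U_1 = PRF(P, S || INT(1)), with INT(1) a four-octet big-endian 1.
u1 = OpenSSL::HMAC.digest(OpenSSL::Digest.new("SHA256"), master_key,
                          info + [1].pack("N"))

one_iteration == u1  # => true
```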

2) The KDF used in SSL is super simple:

 key_block =
   MD5(master_secret + SHA(`A' + master_secret +
                           ServerHello.random +
                           ClientHello.random)) +
   MD5(master_secret + SHA(`BB' + master_secret +
                           ServerHello.random +
                           ClientHello.random)) +
   MD5(master_secret + SHA(`CCC' + master_secret +
                           ServerHello.random +
                           ClientHello.random)) + [...];

but it's Still Secure After All These Years (TM)!

3) Actually, in practice people also use something as simple as HMAC(master_key, "0") and HMAC(master_key, "1") to derive different keys. That's okay because HMAC is a secure PRF, so as long as master_key is random this generates completely random-looking keys.
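For illustration, that simple HMAC-based approach is just (variable names are mine):

```ruby
require "openssl"

digest     = OpenSSL::Digest.new("SHA256")
master_key = OpenSSL::Random.random_bytes(32)

# Distinct, random-looking subkeys from one random master key.
k0 = OpenSSL::HMAC.digest(digest, master_key, "0")
k1 = OpenSSL::HMAC.digest(digest, master_key, "1")

k0 != k1  # => true
```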

So I guess we'll be fine with the approach I propose above. Anything more complex than that is probably overkill and might confuse developers.

@NZKoz NZKoz Add ActiveSupport::KeyGenerator as a simple wrapper around PBKDF2
This will be used to derive keys from the secret and a salt, in order to allow us to
do things like encrypted cookie stores without using the secret for multiple
purposes directly.

OK, I've updated this pull request as per @thaidn's helpful feedback. Barring any objections in the next 24 hours, I'll merge this in for 4.0 and begin the work of changing the cookie / session store to derive the keys rather than using the bare secret.

@NZKoz NZKoz Provide access to the application's KeyGenerator
Available both as an env entry for rack and an instance method on Rails::Application for other uses
@NZKoz NZKoz merged commit 0a50792 into rails:master
@steveklabnik steveklabnik referenced this pull request

encrypted cookie jar #5034

@benja83 benja83 referenced this pull request from a commit in benja83/toyotakataboard-relational