Password verification does not work on different machines. #172
Comments
The only thing I can think of is the salt is somehow mismatched, which would generate a different password hash. It could be an issue with @phc/format de/serialization. Can you share a little bit of the code around when you first hash and then where you compare/verify the hashes?
I am curious as to why that happens.
Sure, so this is when the hash is created:
And this is the verification:
Re SHA-256: this was originally an experiment. The actual communication with the backend is via WS over HTTPS, so technically the hash does very little, but the idea is to evict the plaintext password from memory straight away. In practice, an XSS or side-channel attack would render it useless, and HTTPS provides the transport security anyway.
While that should not happen, you should just hash the password straight away. Did you check that the concatenated data is exactly the same in both functions? SHAing the password first does not add any security (it effectively makes the SHA hash become the password), and concatenating extra data is of no use. I guess you are appending the id and date to avoid rainbow tables, but argon2 is already protected against them by the salt. If you want to get rid of the password as soon as possible, delete it from objects right after hashing or verifying; there's not much else you can do.
Hey, if you wish to incorporate additional data into the hash, I could add that to the library, as Argon2 supports it (see 2.).
I agree; this was the idea behind the SHA-256 experimentation.

If you're wondering why I've developed an obsession with client-side hashing, it's because of legal changes in the UK and Europe generally. Under GDPR and recent legal developments in the UK, if a user's password is compromised from your database and that password is re-used to gain access to another service, causing further damage or a greater loss to the user who re-used it, you can be held liable. Not many people understand that GDPR requires us to protect all sensitive data using any and all means available. The exact extent of what it means by "secure" hasn't been legally tested in the UK yet, but most commentators, myself included, predict it will align with the rest of GDPR's wording and require the best available security measures. Generally, sensitive data is data about a "protected characteristic" such as gender, age, ethnic origin, disability, etc.

Previously, the UK's Data Protection Act 1998 just required "appropriate organisational and technical measures", which essentially means HTTPS will do nicely. GDPR goes a little further: data must be "processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures", and "taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk".
So, in short, it's now expected that you reasonably protect anything that could be used as a vector to attack something else — such as a password, if your server is hacked and the plaintext is stolen and re-used. This is, of course, yet to be tested in court. You can read about the scope of it here if you're interested: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/security/

In terms of the combined data to be verified, this is what we send across all servers; the hash is stored in a MongoDB database and remains constant. Given these are test values, I'll just share them to give you an example (in this case the password is the world's favourite):

My local dev environment: Argon2 hash
Verify comparator
Remote Docker instance on Kubernetes node: Argon2 hash
Verify comparator
So you were definitely right about the additional data being different. It's a bit of a JS quirk, I think, because in the database the creationDate is:

In regards to your suggestion, #172 (comment): I think it's an awesome idea; it would be very nice to have a formal way of including additional data as part of the argon2 module. I'm more than happy to test as well if you like.
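The JS quirk alluded to above can be shown directly: `Date#toString()` depends on the machine's local timezone, while `toISOString()` and `getTime()` do not. If each node stringifies the creationDate with a timezone-dependent method, the comparator bytes differ between nodes. (A generic illustration, not the issue's actual data.)

```javascript
// The same instant, as read from MongoDB on two different nodes:
const createdAt = new Date(0);

// Timezone-dependent: prints differently on a node running in UTC vs. one in BST.
console.log(createdAt.toString());

// Deterministic on every machine, regardless of the node's timezone:
console.log(createdAt.toISOString()); // "1970-01-01T00:00:00.000Z"
console.log(createdAt.getTime());     // 0
```

Building the comparator from `toISOString()` (or the raw epoch milliseconds) makes it identical everywhere.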
Hello @walkerandco, branch `ad` should enable you to use associated data as per the spec I linked above. Just add an `associatedData` option when hashing. Let me know if I can improve something (maybe it should be called something else?).
I think this is a very good idea, @ranisalt. I think you would be better off keeping the name as it is.
v0.22.0 is published with the associated data support. |
Steps to reproduce
Expected behaviour
It should verify the password successfully.
Actual behaviour
argon2.verify returns false, indicating the password does not match. This is incorrect; the passwords do match. Creating a new hash and verifying it on the same Kubernetes node works; however, verifying that hash on another node (which obtains it from the same MongoDB database) fails.
Environment
Operating system:
Nodes: CoreOS latest on GKE
Docker: node:latest
Node version:
v10.15.0
Compiler version:
Additional Information:
This is a Docker image for a Node application: the image is built in a GitLab CI pipeline and then deployed as part of a Helm chart on GKE. argon2 is a module within that application, used to verify passwords for user login.
The scheme is that the user enters a password, which is immediately hashed with SHA-256 on key-down (the unhashed version is never retained in the browser); this is then emitted over an HTTPS WebSocket connection to the backend, where argon2 is available. The password is injected into a function which calls argon2.verify(). The hash is obtained from a MongoDB database; the password comparator is computed by combining the SHA-256 hash, the objectID of the user (constant), and the creation timestamp of that user (constant).
When a user is created, the argon2 hash is stored in MongoDB and is accessible from all nodes. The objectID and creation timestamp are constants (because they are immutable). I have verified these are the same on all nodes for every call.
What appears to be happening is that argon2 is computing the verification differently on each machine. It is presently using this configuration:
$argon2i$v=19$m=4096,t=3,p=1
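For reference, that parameter string decodes as follows (a small stdlib-only parser; the field meanings come from the Argon2 PHC string format: `m` is memory in KiB, `t` is iterations, `p` is parallelism):

```javascript
// Parse the prefix of a PHC-formatted argon2 string, e.g. "$argon2i$v=19$m=4096,t=3,p=1".
function parseArgon2Params(phc) {
  const [, type, versionField, paramsField] = phc.split("$");
  const opts = {};
  for (const kv of paramsField.split(",")) {
    const [k, v] = kv.split("=");
    opts[k] = Number(v);
  }
  return {
    type,                                        // "argon2i"
    version: Number(versionField.split("=")[1]), // 19 (0x13)
    memoryKiB: opts.m,                           // 4096 KiB = 4 MiB
    iterations: opts.t,                          // 3 passes
    parallelism: opts.p,                         // 1 lane
  };
}

console.log(parseArgon2Params("$argon2i$v=19$m=4096,t=3,p=1"));
```

Since these parameters travel inside the stored hash string itself, they are read back identically on every node and can be ruled out as the cause of the mismatch.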
Any ideas why the computation would not be the same on every node, given that they are by nature exactly the same on GKE?