feat(ref-imp): #766 - Implemented hashing of public key as reveal value
thehenrytsai committed Sep 3, 2020
1 parent 3609b9f commit 2b529f0
Showing 33 changed files with 225 additions and 182 deletions.
18 changes: 1 addition & 17 deletions docs/protocol.md
@@ -2,11 +2,6 @@

OFFICIAL SIDETREE SPECIFICATION HERE: https://identity.foundation/sidetree/spec/

![Sidetree System Overview](../www/diagrams/overview-diagram.png)

## Operation chaining of a DID
![DID Operation Chaining](../www/diagrams/operationChaining.png)


## DDoS Attack & Mitigation

@@ -36,15 +31,7 @@ Sidetree specification defines the following mechanisms to enable scaling, while

#### Proof of Fee

Each Sidetree transaction on the target chain is required to include a deterministic fee, based on the number of DID operations it seeks to include via the on-chain transaction. The deterministic rules for the default configuration are still under discussion, but the following roughly represents the current direction:

1. Simple inclusion of a transaction in a block will enable the transaction writer to include a baseline of N operations.
2. Any number of operations that exceeds N will be subject to proof that a fee was paid that meets or exceeds a required amount, determined as follows:
   1. Let the block range R include the last block the node believes to be the latest confirmed and the 9 blocks that precede it.
   2. Compute an array of median fees M, wherein each element is the median of all transaction fees in the corresponding block, less any Sidetree-bearing transactions.
   3. Let the target fee F be the average of all the values contained in M.
   4. Let the per operation cost C be F divided by the baseline amount N.
3. To test the batch for adherence to the Proof of Fee requirement, divide the fee paid in the host transaction by the number of operations in the batch, and ensure that the resulting per operation fee meets or exceeds the required per operation cost C (see the sketch below).
Each Sidetree transaction on the target chain is required to include a deterministic fee, based on the number of DID operations it seeks to include via the on-chain transaction.
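
A minimal sketch of the per-operation fee check described by the rules above. This is an illustration only — the names and inputs (`blockMedianFees`, `baselineOperationCount`) are assumptions, not part of the specification or the reference implementation:

```ts
// Illustrative sketch only — not part of the Sidetree specification.
// `blockMedianFees` is assumed to hold the median non-Sidetree transaction fee of each block in range R.
function isProofOfFeeSatisfied (
  operationCountInBatch: number,
  feePaidInHostTransaction: number,
  blockMedianFees: number[],        // M: one median fee per block in the range R
  baselineOperationCount: number    // N: operations allowed by simple inclusion
): boolean {
  if (operationCountInBatch <= baselineOperationCount) {
    return true; // simple inclusion covers up to N operations
  }

  // Target fee F: average of the per-block median fees M.
  const targetFee = blockMedianFees.reduce((sum, fee) => sum + fee, 0) / blockMedianFees.length;

  // Per-operation cost C = F / N.
  const perOperationCost = targetFee / baselineOperationCount;

  // The fee paid, spread across the operations in the batch, must meet or exceed C.
  return feePaidInHostTransaction / operationCountInBatch >= perOperationCost;
}
```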

#### One Operation per DID per Batch
Only one operation per DID per batch is allowed; this prevents the operation chain of any DID from growing at an intractable rate.
@@ -57,9 +44,6 @@ Sidetree specification defines the following mechanisms to enable scaling, while

The DID owner must reproduce and reveal the correct commitment value in the subsequent operation for the operation to be considered valid. In addition, each subsequent operation must also include the hash of the new commitment value(s) for the next operation. This scheme enables efficient dismissal of counterfeit operations without needing to evaluate signatures.
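
A minimal sketch of this commit/reveal chaining, using plain SHA-256 via Node's `crypto` module rather than the reference implementation's multihash helpers; variable names are illustrative:

```ts
import * as crypto from 'crypto';

const sha256 = (input: Buffer): Buffer => crypto.createHash('sha256').update(input).digest();

// Operation N publishes a commitment: the hash of a value only the DID owner can produce.
const revealValueForNextOperation = Buffer.from('owner-controlled key material');
const commitment = sha256(revealValueForNextOperation);

// Operation N+1 reveals that value; observers check it against the prior commitment
// before any signature verification, so counterfeit operations are dismissed cheaply.
const revealed = Buffer.from('owner-controlled key material');
const matchesCommitment = sha256(revealed).equals(commitment); // true
```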

See [Sidetree REST API](#sidetree-rest-api) section for the schema used to specify reveal values and commitment hashes in each operation.


## Sidetree Client Guidelines
A Sidetree client manages the private keys and performs document operations on behalf of the DID owner. The Sidetree client needs to comply with the following guidelines to keep the DIDs it manages secure.

26 changes: 11 additions & 15 deletions lib/core/Resolver.ts
@@ -201,27 +201,23 @@ export default class Resolver {
}

/**
* Constructs a single commit value -> operation lookup map by looping through each supported hash algorithm,
* hashing each operations as key, then adding the result to a map.
* Constructs a single commit value -> operation lookup map by hashing each operation's reveal value as key, then adding the result to a map.
*/
private async constructCommitValueToOperationLookupMap (nonCreateOperations: AnchoredOperationModel[])
: Promise<Map<string, AnchoredOperationModel[]>> {
const commitValueToOperationMap = new Map<string, AnchoredOperationModel[]>();

// Loop through each supported algorithm and hash each operation.
const allSupportedHashAlgorithms = this.versionManager.allSupportedHashAlgorithms;
for (const hashAlgorithm of allSupportedHashAlgorithms) {
for (const operation of nonCreateOperations) {

const operationProcessor = this.versionManager.getOperationProcessor(operation.transactionTime);
const revealValueBuffer = await operationProcessor.getRevealValue(operation);
const hashOfRevealValue = Multihash.hashThenEncode(revealValueBuffer, hashAlgorithm);
// Loop through each operation and add an entry to the commit value -> operations map.
for (const operation of nonCreateOperations) {
const operationProcessor = this.versionManager.getOperationProcessor(operation.transactionTime);
const multihashRevealValueBuffer = await operationProcessor.getMultihashRevealValue(operation);
const multihashRevealValue = Multihash.decode(multihashRevealValueBuffer);
const multihashOfRevealValue = Multihash.hashThenEncode(multihashRevealValue.hash, multihashRevealValue.algorithm);

if (commitValueToOperationMap.has(hashOfRevealValue)) {
commitValueToOperationMap.get(hashOfRevealValue)!.push(operation);
} else {
commitValueToOperationMap.set(hashOfRevealValue, [operation]);
}
if (commitValueToOperationMap.has(multihashOfRevealValue)) {
commitValueToOperationMap.get(multihashOfRevealValue)!.push(operation);
} else {
commitValueToOperationMap.set(multihashOfRevealValue, [operation]);
}
}

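
An illustrative sketch (not the actual Resolver internals) of how a map like the one built above is meant to be consulted: the operations that can legitimately follow the current DID state are exactly those stored under that state's commitment value, so everything else is dismissed without evaluating a single signature. The helper name and generic type parameter are assumptions:

```ts
// Hypothetical helper; the map is keyed by commitment values computed as above.
function findOperationsRevealingCommitment<TOperation> (
  commitValueToOperationMap: Map<string, TOperation[]>,
  commitment: string // e.g. the DID state's next update or recovery commitment
): TOperation[] {
  return commitValueToOperationMap.get(commitment) ?? [];
}
```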
6 changes: 0 additions & 6 deletions lib/core/VersionManager.ts
@@ -21,8 +21,6 @@ import VersionModel from '../common/models/VersionModel';
* The class that handles code versioning.
*/
export default class VersionManager implements IVersionManager, IVersionMetadataFetcher {
public allSupportedHashAlgorithms: number[] = [];

// Reverse sorted implementation versions. ie. latest version first.
private versionsReverseSorted: VersionModel[];

@@ -104,10 +102,6 @@ export default class VersionManager implements IVersionManager, IVersionMetadata
}
this.versionMetadatas.set(version, versionMetadata);
}

// Get and cache supported hash algorithms.
const hashAlgorithmsWithDuplicates = Array.from(this.versionMetadatas.values(), value => value.hashAlgorithmInMultihashCode);
this.allSupportedHashAlgorithms = Array.from(new Set(hashAlgorithmsWithDuplicates)); // This line removes duplicates.
}

/**
2 changes: 1 addition & 1 deletion lib/core/interfaces/ICas.ts
@@ -12,7 +12,7 @@ export default interface ICas {
/**
* Reads the content of the given address in CAS.
* @param maxSizeInBytes The maximum allowed size limit of the content.
* @returns The fetch result containg the content buffer if found.
* @returns The fetch result containing the content buffer if found.
* The result `code` is set to `FetchResultCode.MaxSizeExceeded` if the content exceeds the specified max size.
*/
read (address: string, maxSizeInBytes: number): Promise<FetchResult>;
4 changes: 2 additions & 2 deletions lib/core/interfaces/IOperationProcessor.ts
@@ -20,7 +20,7 @@ export default interface IOperationProcessor {
): Promise<DidState | undefined>;

/**
* Gets the reveal value of a non-create operation.
* Gets the multihash buffer used as the reveal value of a non-create operation.
*/
getRevealValue (operation: AnchoredOperationModel): Promise<Buffer>;
getMultihashRevealValue (operation: AnchoredOperationModel): Promise<Buffer>;
}
10 changes: 3 additions & 7 deletions lib/core/interfaces/IVersionManager.ts
@@ -8,13 +8,9 @@ import ITransactionSelector from './ITransactionSelector';
* Defines an interface to return the correct 'version-ed' objects.
*/
export default interface IVersionManager {

/** All the supported hash algorithms. */
allSupportedHashAlgorithms: number[];

/**
* Gets the batchwriter for the given blockchain time.
* @param blockchainTime The blockchain time for which the batchwriter is needed.
* Gets the batch writer for the given blockchain time.
* @param blockchainTime The blockchain time for which the batch writer is needed.
*/
getBatchWriter (blockchainTime: number): IBatchWriter;

@@ -26,7 +22,7 @@ export default interface IVersionManager {

/**
* Gets the request handler for the given blockchain time.
* @param blockchainTime The blockchain time for which the requesthandler is needed.
* @param blockchainTime The blockchain time for which the request handler is needed.
*/
getRequestHandler (blockchainTime: number): IRequestHandler;

17 changes: 11 additions & 6 deletions lib/core/versions/0.9.0/OperationProcessor.ts
@@ -9,6 +9,7 @@ import JsonCanonicalizer from './util/JsonCanonicalizer';
import Multihash from './Multihash';
import Operation from './Operation';
import OperationType from '../../enums/OperationType';
import ProtocolParameters from './ProtocolParameters';
import RecoverOperation from './RecoverOperation';
import SidetreeError from '../../../common/SidetreeError';
import UpdateOperation from './UpdateOperation';
@@ -60,30 +61,34 @@ export default class OperationProcessor implements IOperationProcessor {
return appliedDidState;
}

public async getRevealValue (anchoredOperationModel: AnchoredOperationModel): Promise<Buffer> {
public async getMultihashRevealValue (anchoredOperationModel: AnchoredOperationModel): Promise<Buffer> {
if (anchoredOperationModel.type === OperationType.Create) {
throw new SidetreeError(ErrorCode.OperationProcessorCreateOperationDoesNotHaveRevealValue);
}

const operation = await Operation.parse(anchoredOperationModel.operationBuffer);

let revealValueBuffer;
let canonicalizedKeyBuffer;
switch (operation.type) {
case OperationType.Recover:
const recoverOperation = (operation as RecoverOperation);
revealValueBuffer = JsonCanonicalizer.canonicalizeAsBuffer(recoverOperation.signedData.recoveryKey);
canonicalizedKeyBuffer = JsonCanonicalizer.canonicalizeAsBuffer(recoverOperation.signedData.recoveryKey);
break;
case OperationType.Update:
const updateOperation = (operation as UpdateOperation);
revealValueBuffer = JsonCanonicalizer.canonicalizeAsBuffer(updateOperation.signedData.updateKey);
canonicalizedKeyBuffer = JsonCanonicalizer.canonicalizeAsBuffer(updateOperation.signedData.updateKey);
break;
default: // This is a deactivate.
const deactivateOperation = (operation as DeactivateOperation);
revealValueBuffer = JsonCanonicalizer.canonicalizeAsBuffer(deactivateOperation.signedData.recoveryKey);
canonicalizedKeyBuffer = JsonCanonicalizer.canonicalizeAsBuffer(deactivateOperation.signedData.recoveryKey);
break;
}

return revealValueBuffer;
// TODO: Issue #766 - Remove temporary assumption on reveal value being calculated using the same hash algorithm
// as the algorithm used by the protocol version when the operation is anchored.
const revealValueHashAlgorithm = ProtocolParameters.hashAlgorithmInMultihashCode;
const multihashRevealValueBuffer = Multihash.hash(canonicalizedKeyBuffer, revealValueHashAlgorithm);
return multihashRevealValueBuffer;
}

/**
2 changes: 1 addition & 1 deletion lib/core/versions/0.9.0/RequestHandler.ts
@@ -201,7 +201,7 @@ export default class RequestHandler implements IRequestHandler {
/**
* Resolves the given long-form DID by resolving using operations found over the network first;
* if no operations found, the given create operation will be used to construct the DID state.
*
*
* @returns [DID state, published]
*/
private async resolveLongFormDid (did: Did): Promise<[DidState | undefined, boolean]> {
77 changes: 56 additions & 21 deletions lib/core/versions/latest/Multihash.ts
@@ -14,12 +14,26 @@ export default class Multihash {
/**
* Hashes the content using the hashing algorithm specified.
* @param hashAlgorithmInMultihashCode The hashing algorithm to use. If not given, latest supported hashing algorithm will be used.
* @returns A multihash buffer.
*/
public static hash (content: Buffer, hashAlgorithmInMultihashCode?: number): Buffer {
if (hashAlgorithmInMultihashCode === undefined) {
hashAlgorithmInMultihashCode = ProtocolParameters.hashAlgorithmInMultihashCode;
}

const conventionalHash = this.hashAsNonMultihashBuffer(content, hashAlgorithmInMultihashCode);

const multihash = multihashes.encode(conventionalHash, hashAlgorithmInMultihashCode);

return multihash;
}

/**
* Hashes the content using the hashing algorithm specified as a generic (non-multihash) hash.
* @param hashAlgorithmInMultihashCode The hashing algorithm to use.
* @returns A plain (non-multihash) hash buffer.
*/
public static hashAsNonMultihashBuffer (content: Buffer, hashAlgorithmInMultihashCode: number): Buffer {
let hash;
switch (hashAlgorithmInMultihashCode) {
case 18: // SHA256
@@ -29,19 +43,19 @@
throw new SidetreeError(ErrorCode.MultihashUnsupportedHashAlgorithm);
}

const hashAlgorithmName = multihashes.codes[hashAlgorithmInMultihashCode];
const multihash = multihashes.encode(hash, hashAlgorithmName);

return multihash;
return hash;
}

/**
* Canonicalize the given content, then multihashes the result using the lastest supported hash algorithm, then encodes the multihash.
* Canonicalize the given content, then double hashes the result using the latest supported hash algorithm, then encodes the multihash.
* Mainly used for testing purposes.
*/
public static canonicalizeThenHashThenEncode (content: object) {
public static canonicalizeThenDoubleHashThenEncode (content: object) {
const contentBuffer = JsonCanonicalizer.canonicalizeAsBuffer(content);
const multihashEncodedString = Multihash.hashThenEncode(contentBuffer, ProtocolParameters.hashAlgorithmInMultihashCode);

// Double hash.
const intermediateHashBuffer = Multihash.hashAsNonMultihashBuffer(contentBuffer, ProtocolParameters.hashAlgorithmInMultihashCode);
const multihashEncodedString = Multihash.hashThenEncode(intermediateHashBuffer, ProtocolParameters.hashAlgorithmInMultihashCode);
return multihashEncodedString;
}

@@ -56,18 +70,22 @@
}

/**
* Given a multihash, returns the code of the hash algorithm used.
* Given a multihash, returns the code of the hash algorithm, and digest buffer.
* @returns [hash algorithm code, digest buffer]
* @throws `SidetreeError` if hash algorithm used for the given multihash is unsupported.
*/
public static getHashAlgorithmCode (multihashBuffer: Buffer): number {
public static decode (multihashBuffer: Buffer): { algorithm: number, hash: Buffer } {
const multihash = multihashes.decode(multihashBuffer);

// Hash algorithm must be SHA-256.
if (multihash.code !== 18) {
throw new SidetreeError(ErrorCode.MultihashUnsupportedHashAlgorithm);
}

return multihash.code;
return {
algorithm: multihash.code,
hash: multihash.digest
};
}

/**
@@ -123,17 +141,38 @@
}

/**
* Canonicalizes the given content object, then verifies the multihash against the canonicalized string as a UTF8 buffer.
* Canonicalizes the given content object, then verifies the multihash as a "double hash"
* (ie. the given multihash is the hash of a hash) against the canonicalized string as a UTF8 buffer.
*/
public static canonicalizeAndVerify (content: object | undefined, encodedMultihash: string): boolean {
public static canonicalizeAndVerifyDoubleHash (content: object | undefined, encodedMultihash: string): boolean {
if (content === undefined) {
return false;
}

try {
const contentBuffer = JsonCanonicalizer.canonicalizeAsBuffer(content);

return Multihash.verify(contentBuffer, encodedMultihash);
return Multihash.verifyDoubleHash(contentBuffer, encodedMultihash);
} catch (error) {
console.log(error);
return false;
}
}

/**
* Verifies the multihash as a "double hash" (ie. the given multihash is a hash of a hash) against the content `Buffer`.
* Note that the intermediate hash is required to be a non-multihash hash by the same hash algorithm as the final multihash.
*/
private static verifyDoubleHash (content: Buffer, encodedMultihash: string): boolean {

try {
const expectedMultihashBuffer = Encoder.decodeAsBuffer(encodedMultihash);
const hashAlgorithmCode = Multihash.decode(expectedMultihashBuffer).algorithm;

const intermediateHashBuffer = Multihash.hashAsNonMultihashBuffer(content, hashAlgorithmCode);
const actualMultihashBuffer = Multihash.hash(intermediateHashBuffer, hashAlgorithmCode);

return Buffer.compare(actualMultihashBuffer, expectedMultihashBuffer) == 0;
} catch (error) {
console.log(error);
return false;
@@ -146,16 +185,12 @@
private static verify (content: Buffer, encodedMultihash: string): boolean {

try {
const multihashBuffer = Encoder.decodeAsBuffer(encodedMultihash);

const hashAlgorithmCode = Multihash.getHashAlgorithmCode(multihashBuffer);
const actualHashBuffer = Multihash.hash(content, hashAlgorithmCode);
const expectedMultihashBuffer = Encoder.decodeAsBuffer(encodedMultihash);
const hashAlgorithmCode = Multihash.decode(expectedMultihashBuffer).algorithm;

if (Buffer.compare(actualHashBuffer, multihashBuffer) !== 0) {
return false;
}
const actualMultihashBuffer = Multihash.hash(content, hashAlgorithmCode);

return true;
return Buffer.compare(actualMultihashBuffer, expectedMultihashBuffer) == 0;
} catch (error) {
console.log(error);
return false;
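
A hedged usage sketch (not a test from this commit) showing how the new helpers are intended to fit together: the commitment is the encoded double hash of a canonicalized public key, the reveal value is a single multihash of the same key, and either one can be checked against the other. The import paths are assumed to be relative to the version folder containing `Multihash.ts`, and the JWK literal is a placeholder:

```ts
import JsonCanonicalizer from './util/JsonCanonicalizer';
import Multihash from './Multihash';

const publicKeyJwk = { kty: 'EC', crv: 'secp256k1', x: 'placeholder-x', y: 'placeholder-y' };

// Commitment published in the previous operation: double hash of the canonicalized public key.
const commitment = Multihash.canonicalizeThenDoubleHashThenEncode(publicKeyJwk);

// Reveal value carried by the next operation: a single multihash of the same canonicalized key
// (this is what OperationProcessor.getMultihashRevealValue() now returns as a buffer).
const revealValueBuffer = Multihash.hash(JsonCanonicalizer.canonicalizeAsBuffer(publicKeyJwk), 18); // 18 = SHA-256

// Verifying the revealed key against the earlier commitment requires no signature work.
console.log(Multihash.canonicalizeAndVerifyDoubleHash(publicKeyJwk, commitment)); // true

// Hashing the decoded reveal value once more reproduces the commitment — exactly the lookup key
// that Resolver.constructCommitValueToOperationLookupMap() computes.
const { algorithm, hash } = Multihash.decode(revealValueBuffer);
console.log(Multihash.hashThenEncode(hash, algorithm) === commitment); // true
```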