
ECIP-1049: Change the ETC Proof of Work Algorithm to Keccak256 #13

Open
p3c-bot opened this issue Jan 16, 2019 · 7 comments

@p3c-bot
Contributor

commented Jan 16, 2019

Change the ETC Proof of Work Algorithm to Keccak256

ECIP: 1049
Title: Change the ETC Proof of Work Algorithm to Keccak256
Status: Draft
Type: Network
Discussion: https://github.com/ethereumclassic/ECIPs/issues/13
Author: Alexander Tsankov (alexander.tsankov@colorado.edu)
Created: 2019-01-08

Abstract

A proposal to replace the current Ethereum Classic proof of work algorithm with Keccak-256.

Motivation

  • A response to the recent double-spend attacks against Ethereum Classic. Most of this hashpower was rented or came from other chains, specifically Ethereum (ETH). A separate proof of work algorithm would encourage the development of a specialized Ethereum Classic mining community, and blunt attackers' ability to purchase mercenary hash power on the open market.

  • As a secondary benefit, smart contracts and dapps deployed on chain can already call keccak256() in their code. This ECIP could open the possibility of smart contracts evaluating chain state, and simplify second-layer (L2) development.

Rationale

Reason 1: Similarity to Bitcoin

The Bitcoin network currently uses the CPU-intensive SHA256 algorithm to evaluate blocks. When Ethereum was deployed it used a different algorithm, Dagger-Hashimoto, which eventually became Ethash at the 1.0 launch. Dagger-Hashimoto was explicitly designed to be memory-intensive, with the goal of ASIC resistance [1]. It has been demonstrably unsuccessful at this goal, with Ethash ASICs now readily available on the market.

Keccak256 (closely related to SHA3; Ethereum's variant predates the final NIST padding change) is the product of decades of research and the winner of a multi-year contest held by NIST that rigorously verified its robustness and quality as a hashing algorithm. It is one of the few hashing algorithms besides SHA256 approved for military and scientific-grade applications, and it can provide sufficient hashing entropy for a proof of work system. This algorithm would position Ethereum Classic at an advantage in mission-critical blockchain applications that are required to use provably high-strength algorithms. [2]

A CPU-intensive algorithm like Keccak256 would offer both the uniqueness of a fresh PoW algorithm against which no ASICs have yet been developed, and room for the organic optimization of a dedicated, financially committed miner base, much as Bitcoin saw with its own SHA256 algorithm.

If Ethereum Classic is to succeed as a project, we need to take what we have learned from Bitcoin and move towards CPU-hard PoW algorithms.

At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. - Satoshi Nakamoto (2008-11-03) [3]

Note: Please consider that this is from 2008, when the Bitcoin community did not yet differentiate between node operators and miners. I interpret "network nodes" in this quote to refer to miners, and "server farms of specialized hardware" to refer to mining farms.

Reason 2: Value to Smart Contract Developers

In Solidity, developers have access to the keccak256() function, which allows a smart contract to efficiently calculate the hash of a given input. This has been used in a number of interesting projects launched on both Ethereum and Ethereum Classic, most notably a project called 0xBitcoin [4], on which the ERC-918 spec was based.

0xBitcoin is a security-audited [5] dapp that allows users to submit a proof of work hash directly to a smart contract running on the Ethereum blockchain. If the submitted hash meets the given requirements, a token reward is trustlessly dispensed to the sender, and the contract reevaluates its difficulty parameters. This project has run successfully for over 10 months and has minted over 3 million tokens [6].
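The mining flow described above can be sketched as a simple search loop. This is a hedged illustration rather than 0xBitcoin's actual scheme: the function name, input layout, and nonce width are hypothetical, and Python's standard-library SHA3-256 stands in for Ethereum's keccak256(), which uses different padding, so real digests would differ.

```python
import hashlib

def mine(challenge: bytes, address: bytes, target: int, max_nonce: int = 2**20):
    """Search for a nonce whose hash of (challenge ++ address ++ nonce)
    falls below the difficulty target.

    NOTE: hashlib.sha3_256 is the NIST variant; Ethereum's keccak256()
    pads differently, so this is only a structural sketch.
    """
    for nonce in range(max_nonce):
        digest = hashlib.sha3_256(
            challenge + address + nonce.to_bytes(32, "big")
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None, None  # no valid nonce found within the search budget

# Easy demonstration target: roughly 1 in 256 nonces qualifies.
nonce, digest = mine(b"\x11" * 32, b"\x22" * 20, target=2**248)
```

On a valid submission the real contract would also mint the reward and retarget difficulty; both are omitted here.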

Given the direction Ethereum Classic is taking (a focus on Layer-2 solutions and cross-chain compatibility), being able to evaluate proof of work on chain will be tremendously valuable to both smart-contract developers and node-software writers, and could greatly simplify interoperability.

Implementation

Work in Progress:

Example of a smart contract trustlessly Keccak-hashing a hypothetical block header.

Here is an analysis of Monero's nonce distribution for CryptoNight, an algorithm similar to Ethash that also attempts to be "ASIC-resistant". The chart makes clear that before the hashing algorithm was changed there was a distinct nonce pattern. This is indicative of a major failure in a hashing algorithm, and should illustrate the dangers of disregarding proper cryptographic security. Finding a hashing pattern would be far harder in a proven system like Keccak.


Based on analysis of the EVM architecture here, there are two main pieces that need to be changed:

  1. The proof of work function needs to be replaced with Keccak256.
  2. The function that checks the nonce in the block header needs to accept Keccak256 hashes as valid for a block.
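In sketch form, the nonce check in step 2 reduces to a single hash comparison against a difficulty-derived boundary. The encoding below (a fixed 8-byte nonce appended to the header hash) and the use of Python's NIST SHA3-256 in place of Ethereum's Keccak-256 are assumptions for illustration, not multi-geth's actual seal-verification code.

```python
import hashlib

TWO_256 = 2 ** 256

def seal_is_valid(header_hash: bytes, nonce: int, difficulty: int) -> bool:
    """Accept a block when hash(header_hash ++ nonce) <= 2^256 / difficulty.

    Stand-in hash: NIST SHA3-256 (Ethereum's Keccak-256 pads differently).
    """
    digest = hashlib.sha3_256(header_hash + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") <= TWO_256 // difficulty
```

Higher difficulty shrinks the acceptance boundary, so miners must try more nonces; difficulty 1 accepts any nonce.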


After further analysis, the best way forward is to implement this change in Multi-Geth rather than any other client: Multi-Geth is organized for multi-chain development, appears to be more recently updated than Classic-Geth, and is designed to be used with alternative consensus methods, which is necessary for implementing ECIP-1049.

Most of the changes will be in multi-geth/consensus.

Reference implementation

See this pull request to multi-geth for "Astor" a Keccak256 Ethereum Classic Testnet.

References:

  1. https://github.com/ethereum/wiki/wiki/Dagger-Hashimoto#introduction
  2. https://en.wikipedia.org/wiki/SHA-3
  3. https://satoshi.nakamotoinstitute.org/emails/cryptography/2/
  4. https://github.com/0xbitcoin/white-paper
  5. EthereumCommonwealth/Auditing#102
  6. https://etherscan.io/address/0xb6ed7644c69416d67b522e20bc294a9a9b405b31

Previous discussion from Pull request

@p3c-bot

Contributor Author

commented Jan 27, 2019

Work has officially begun on Astor testnet - a reference implementation of an Ethereum Classic Keccak256 testnet. Any help is appreciated.

Astor Place Station in New York is one of the first subway stations in the city, and we plan for the testnet to be similarly resilient, while also delivering far greater performance by replacing the overly complicated Ethash proof of work algorithm.

@realcodywburns

Member

commented Feb 7, 2019

"I think the intent of this ECIP is to just respond with an ECIP because the ECIP knowingly isn't trying to solve the problems of the claimed catalyst (51 attack). ETC can change it's underwear in some way but it has to have some type of super power than 'just cause'. I reject." - @stevanlohja #8 (comment)

@Harriklaw


commented Apr 4, 2019

First and most crucial question: do we need an algo change, and how could an algo change help us? For me there are two aspects that should be examined at the same time. The first is how secure the new PoW is versus the old one. As you nicely wrote, a well-examined algorithm such as Keccak256 is both scientifically reviewed and, as the successor of SHA2, has a high probability of succeeding as SHA2 did with Bitcoin. This can be controversial though, so this article can strengthen the case for Keccak; it is considered that it may be quantum resistant: https://eprint.iacr.org/2016/992.pdf

"Our estimates are by no means a lower bound, as they are based on a series of assumptions. First, we optimized our T-count by optimizing each component of the SHA oracle individually, which of course is not optimal. Dedicated optimization schemes may achieve better results. Second, we considered a surface code fault-tolerant implementation, as such a scheme looks the most promising at present. However it may be the case that other quantum error correcting schemes perform better. Finally, we considered an optimistic per-gate error rate of about 10^-5, which is the limit of current quantum hardware. This number will probably be improved in the future. Improving any of the issues listed above will certainly result in a better estimate and a lower number of operations, however the decrease in the number of bits of security will likely be limited."

The second aspect we should examine is how the algo change will influence decentralization, and this topic is more controversial. As economics are the most decisive factor for ASIC development (assuming that ETC will be valuable), a change will lead to new ASICs very soon. For me the real question is: how soon? And the answer is clearly hypothetical. Why is this a crucial question? First of all, if ASICs already exist, that would be unfair and centralizing for the interval in which new companies develop and evolve their own hardware. If this is not the case, companies that already produce ASICs for SHA2 and other CPU-intensive algos will produce SHA3 ASICs very fast, as they already have the know-how and have learnt to adapt in this hardware/algo chase game very well. But do we want that? Do we want big ASIC companies to have the upper hand in ETC mining hardware production? If we accept that decentralization is already well established among the crypto hardware industry (meaning ASIC companies) and many companies have already joined the space, then decentralization for SHA3 will be achieved soon. But if we accept that the GPU industry is a better space for our community (for decentralization purposes), then we should consider that any change to a CPU-intensive algo will provoke massive change for our miners and mining ecosystem. Ethash, compared to Keccak, is memory intensive, and GPUs are currently quite competitive with ASICs: 1) efficiency: RX 580 = 3.33 W/MH vs. A10 = 1.75 W/MH; 2) price: RX 580 = $150 ($5/MH) vs. A10 = $5600 ($11/MH).
So the real question is pretty much this: CPU intensive vs. memory intensive? GPUs plus ASICs, or ASICs only? Is BTC or ETC more decentralized? I think that, for now, GPUs plus ASICs in the Ethash ecosystem make a healthy environment for decentralized hash power, although BTC seems to be well decentralized too.
Conclusion: for me an algo change will be profitable long term, as Keccak256 seems superior to Ethash in terms of security. Nevertheless, Ethash seems superior in terms of decentralization. Short term, we should consider other ways to reduce the risk of a future 51% attack and allow the crypto mining industry to mature. That would lead to a more decentralized mining hardware industry and accord with our vision of a better decentralized ecosystem.

@p3c-bot

Contributor Author

commented Apr 4, 2019

Thank you for your post @Harriklaw. The plan for this switch is to create a SHA3 testnet first, for miners and hardware manufacturers to use, become comfortable with, and collect data on. Once we start seeing Flyclients, increased block performance, and on-chain smart contracts that verify the chain's proof of work, the mining community will see the tremendous value of this new algorithm and support a change.

RE: decentralization. I consider Ethash to already be ASIC'd, and as ETC becomes more valuable it will become less feasible to mine it with a GPU anyway. The concern is that right now Ethash is so poorly documented that only one or two companies know how to build a profitable ASIC for it. With SHA3, however, it is conceivable that new startups and established players (like Intel, Cisco, etc.) would feel comfortable entering the mining hardware market, since they know the SHA3 standard is transparent, widely used, and has uses beyond cryptocurrency.

SHA3 has been determined to be 4x faster in hardware than SHA2, so it is conceivable that an entirely new economy, distinct from SHA2's, could be created around SHA3, similar to how the trucking market has different companies than the consumer car market.

@saturn-network


commented Apr 5, 2019

Re: Quantum resistance of hash functions

  1. By the time it is possible to build a quantum computer that can crack keccak256 (sha3) there will be another generation or two of hash functions (think sha4 and sha5).
  2. Elliptic curve cryptography in Ethereum's private/public keys (as in the vast majority of cryptocurrencies, including ETH, BTC, ETC, TRX, ...) will be cracked much sooner than that. Who cares about mining crypto when you can literally steal other people's money (e.g. steal Satoshi's bitcoin)?

I do not think we should worry about quantum resistance in this ECIP.

@saturn-network


commented Apr 5, 2019

@p3c-bot frankly, we might even see sha3 ASICs embedded in desktop and mobile processors. In fact, SHA256 already has optimized instructions on ARM and Intel. Chances of Ethash instructions in ARM and Intel are slim to none at this point.

@zmitton

Contributor

commented Apr 14, 2019

In the process of creating an ETC FlyClient, I have run into major blockers that can be eliminated if 1049 (this ECIP) is adopted.

Basically, verification right now cannot be done without some serious computation. The main issue is that Ethash requires the generation of a 16 MB pseudorandom cache. This cache changes about once a week, so verifying the full work requires generating it many times. I have tried many creative solutions to this, but I believe we are stuck with light-client verification taking at least 10 minutes on a phone.

By contrast, with this ECIP plus FlyClient (ECIP-1055), I'm confident full PoW verification can be done in less than 5 seconds. This would open the door to new UX design patterns.
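To make the cost difference concrete: with a Keccak-style seal, a light client pays one hash call per header and needs no Ethash cache at all. A minimal sketch, again substituting Python's NIST SHA3-256 for Ethereum's Keccak-256 and assuming a simplified (header_hash, nonce, difficulty) header tuple:

```python
import hashlib

def verify_chain(headers) -> bool:
    """Check the proof of work of every header with one hash call each.

    headers: iterable of (header_hash: bytes, nonce: int, difficulty: int).
    Unlike Ethash, no per-epoch cache generation is needed.
    """
    for header_hash, nonce, difficulty in headers:
        digest = hashlib.sha3_256(header_hash + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") > 2**256 // difficulty:
            return False
    return True

# Verifying thousands of headers is just thousands of hash calls.
sample = [(bytes([i % 256]) * 32, 0, 1) for i in range(1000)]
```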
