Buffer to Uint8Array cleanup (#2607)
* `Buffer` to `Uint8Array` conversion (#2566)

* V7 update to master 1 (#2593)

* Added v7 release reference in main README table (#2562)

* common: Schedule shanghai on goerli (#2563)

* common: Schedule shanghai on goerli

* update timestamp

* util/tx: Shift ssz back to case dependency free ES2019 compatible version (#2564)

* util/tx: Shift ssz back to case dependency free ES2019 compatible version

* update package lock

* update karma ecma version

* VM: some optimization on the bnadd/bnmul precompiles to only copy over the necessary 128 bytes as input for the WASM call (#2568)

* EVM: Memory Fix & Other Optimizations (#2570)

* EVM: Rename the EVM debug logger to evm:evm (one for the package, one for the class) for consistency; otherwise the logger is left out when run with evm:*

* VM: Rename message checkpoint to state checkpoint in debug message (there is a dedicated message checkpoint msg alongside msg logging)

* EVM: CALL/CREATE debug exit msg differentiation

* EVM: avoid buffer copy in memory read (performance)

* EVM: Rewrite runCall() checkpoint/revert conditional for readability/simplification

* EVM: Added EIP check for transient storage checkpointing

* EVM: Precompile Debug Logger Improvements (#2572)

* EVM: Added general precompile debug logger setup, first ECRECOVER exemplary debug logging

* EVM: Added remaining precompile debug loggers

* EVM: added error cases to BLS precompile debug log

* EVM: Added missing precompile return value debug logs

* Small fixes

* tx: ensure eip3860 txs can have more than max_initcode_size data if to field is non-empty (#2575)

* EVM: Avoid memory.read() Memory Copy (#2573)

* EVM: added avoidCopy parameter to memory.read() function, first test on CREATE opcode

* EVM: Add direct memory read to all calling opcodes

* EVM: Copy over memory on IDENTITY precompile

* EVM: remove length checks and return buffer 0-filling in Memory.read() (memory is unconditionally being extended properly anyhow)

* Some optimizations

* blockchain: fix merge->clique transition (#2571)

* Client: ensure safe/finalized blocks are part of the canonical chain on forkchoiceUpdated (#2577)

* client/engine: ensure finalized/safe blocks are in canonical chain

* client: engine-api: fix finalized block check

* client/tests: fix forkchoice updated test

* client: add fcu tests to check if blocks are part of canonical chain

* client/engine: ensure payload has a valid timestamp forkchoiceUpdated (#2579)

* client/engine: ensure invalid blockhash response matches spec (#2583)

* client/engine: delete invalid skeleton blocks (#2584)

Co-authored-by: acolytec3 <17355484+acolytec3@users.noreply.github.com>

* Setup to dev/test snapsync with sim architecture (#2574)

* Setup to dev/test snapsync with sim architecture

* modify single-run to set up a lodestar<>geth node to snapsync from

* setup an ethereumjs inline client and get it to peer with geth

* cleanup setup a bit

* snapsync run spec

* get the snap testdev sim working

* finalize the test infra and update usage doc

* enhance coverage

* Use geth RPC to connect to ethJS

* refac wait for snap sync completion

---------

Co-authored-by: acolytec3 <17355484+acolytec3@users.noreply.github.com>

* client: Add safe and finalized block options to the chain (#2585)

* client: Add safe and finalized block options to the chain

* fix tests

* fix more tests

* fix remaining

* cleanup

* enhance coverage

* unset scheduled Goerli timestamp-based HFs colliding with test

* Client: Small Debug Helpers and CLI Improvements (#2586)

* Client: new constant MAX_TOLERATED_BLOCK_TIME for execution, added warning for slowly executed blocks

* Client -> Execution: NumBlocksPerIteration (default: 50) as an option

* Client: only restart RLPx server or log peer stats if max peers is set to be greater than 0

* Apply suggestions from code review

Co-authored-by: acolytec3 <17355484+acolytec3@users.noreply.github.com>

* Apply suggestions from code review

---------

Co-authored-by: acolytec3 <17355484+acolytec3@users.noreply.github.com>

* common: Schedule Shanghai on mainnet! (#2591)

* common: Schedule Shanghai on mainnet!

* clear hf timestamp for test

* VM: Diff-based Touched Accounts Checkpointing (#2581)

* VM: Switched to a more efficient diff-based way of touched account checkpointing

* VM: move accessed storage inefficient checkpointing problem to berlin, haha

* EVM: avoid memory copy in MLOAD opcode function

* Remove console.log() in EVM

* vmState: ensure touched accounts delete stack gets properly updated on commit

* vm/eei: save touched height

* vm/vmState: new possible fix for touched accounts

* vm/vmState: another attempt to fix touched accounts journaling

* vm: add journaling

* Check correct journal height on revert

---------

Co-authored-by: Jochem Brouwer <jochembrouwer96@gmail.com>
Co-authored-by: acolytec3 <17355484+acolytec3@users.noreply.github.com>

---------

Co-authored-by: Holger Drewes <Holger.Drewes@gmail.com>
Co-authored-by: g11tech <gajinder@g11.in>
Co-authored-by: Jochem Brouwer <jochembrouwer96@gmail.com>

* First pass - most util changes

* Fix most account stuff

* Fix account test

* Many byte fixes

* util: fix constants tests

* remaining fixes

* Turn off ci jobs

* monorepo: bigIntToUnpaddedBuffer -> bigIntToUnpaddedBytes

* util: update description of bytes export

* util: remove unused import

* common: use bytesToHex helper instead of toString('hex')

* trie: refactor non-test files to Uint8Array

* util: add binary string utils

* util: remove extra Uint8Array.from

* util: remove arrToBufArray util

* trie: adjust tests and fix outstanding issues

* util: remove binarystring utils and add compareBytes and randomBytes util

* common: refactor common with Uint8Array

* util: accept 0x-prefixed and non-prefixed hex strings in toBytes

* tx: refactor Buffer -> Uint8Array

* tx: remove unused import

* util: revert toBytes update

* block: refactor Buffer -> uint8array

* block: adjust import

* trie: refactor remaining buffer instances

* move devp2p to uint8Array

* statemanager: refactor buffer -> uint8array

* util: simplify zeros

* util: add concatBytesUnsafe

* ethash: partial migration

* ethash: update examples

* ethash: wip fixes

* ethash: more WIP

* ethash: ensure fnv input is read from mix, not mixView

* blockchain: migrate to uint8array

* ethash: re-enable all tests

* ethash: fix bytesReverse

* Fix miner tests

* many hexToBytes moves

* most of evm/vm moves

* evm: Fix all tests

* vm: more fixes

* More fixes

* vm: fix receipts encoding

* vm: fix tester

* client: refactor buffer to uint8array

* client: additional uint8 adjustments

* client: fix most tests

* fix remaining client unit tests

* reactivate most CI

* client: fix les test/protocol

* turn client CI on

* util: fix name typo

* lint

* Fix withdrawals

* remove buffarraytoarr

* Remove bufArrtoArr references

* Lint

* fix examples

* Fix difficulty test

* lint

* block: update randomBytes import

* replace randombytes import

* client: fix sim test util

* vm: fix example

* devp2p: update snappy typing and fix tests

* Fix tests

* Remove additional buffer references

* rustbn fixes

* add 0x prefix to precompile address

* Remove `node-ip` dependency and buffer references in devp2p

* Switch slice to subarray

* evm: fix blake2f

* Merge fixes

* more merge commit fixes

* more test fixes

* Address all the feedback

* fix dns test

* Update packages/util/src/bytes.ts

Co-authored-by: Jochem Brouwer <jochembrouwer96@gmail.com>

* Fix return type for baToJSON

* util: instantiate hexByByte array

* Remove baToJson

* rebase fixes

* Fix event typing

* Revert outdated initcode changes

* lint

---------

Co-authored-by: Holger Drewes <Holger.Drewes@gmail.com>
Co-authored-by: g11tech <gajinder@g11.in>
Co-authored-by: Jochem Brouwer <jochembrouwer96@gmail.com>
Co-authored-by: Gabriel Rocheleau <contact@rockwaterweb.com>

* Devp2p status fix

* Remove buffer detritus

* Switch db to view

* buffer cleanup

* Correctly parse heads from DB

* Fix encodings

* Cast db values to uint8array

* Fix db bug in ethash

* Remove unused peerId check

* Add pow miner test

* Finish test

* Move pow test to integration tests

* client: lint

* rename test helper

* Add timeout to PoW test

* Fix test runner

* remove console log

---------

Co-authored-by: Holger Drewes <Holger.Drewes@gmail.com>
Co-authored-by: g11tech <gajinder@g11.in>
Co-authored-by: Jochem Brouwer <jochembrouwer96@gmail.com>
Co-authored-by: Gabriel Rocheleau <contact@rockwaterweb.com>
5 people committed Apr 4, 2023
1 parent 54dc2e3 commit fd2e4ea
Showing 31 changed files with 170 additions and 67 deletions.
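The diffs below largely follow a mechanical mapping from `Buffer` APIs to `Uint8Array` equivalents. A hedged sketch of the typical replacements (the helper names mirror those used in `@ethereumjs/util` and `ethereum-cryptography`, but are re-implemented inline here so the sketch is self-contained, not the library code itself):

```typescript
// Buffer.alloc(32)            -> new Uint8Array(32)
const zeros = new Uint8Array(32)

// Buffer.from('dead', 'hex')  -> hexToBytes('0xdead')
function hexToBytes(hex: string): Uint8Array {
  const h = hex.startsWith('0x') ? hex.slice(2) : hex
  const out = new Uint8Array(h.length / 2)
  for (let i = 0; i < out.length; i++) out[i] = parseInt(h.slice(i * 2, i * 2 + 2), 16)
  return out
}

// a.equals(b)                 -> equalsBytes(a, b)
function equalsBytes(a: Uint8Array, b: Uint8Array): boolean {
  return a.length === b.length && a.every((v, i) => v === b[i])
}

// buf.slice(0, 2) (copies)    -> buf.subarray(0, 2) (view, no copy)
const bytes = hexToBytes('0xdeadbeef')
const head = bytes.subarray(0, 2) // Uint8Array [0xde, 0xad]
```

The `slice` -> `subarray` swap noted in the commit list above is the performance-sensitive one: `subarray` returns a view over the same backing store rather than allocating.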
13 changes: 8 additions & 5 deletions packages/blockchain/src/db/manager.ts
@@ -65,7 +65,11 @@ export class DBManager {
async getHeads(): Promise<{ [key: string]: Uint8Array }> {
const heads = await this.get(DBTarget.Heads)
for (const key of Object.keys(heads)) {
heads[key] = Uint8Array.from(heads[key])
// DB incorrectly stores the `uint8Array` representation of each head hash
// as a JSON object of key value pairs where the key is the array index
// and the value is the uint8 from that index of the original array
// so we convert it back to a Uint8Array before storing the heads
heads[key] = Uint8Array.from(Object.values(heads[key]))
}
return heads
}
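The comment in the hunk above describes a round-trip bug that is easy to reproduce in isolation: under JSON value encoding, a `Uint8Array` is stored as an object keyed by array index, and `Object.values` recovers the bytes. A minimal sketch (illustrative only, not the actual DBManager code path):

```typescript
const headHash = Uint8Array.from([0xde, 0xad, 0xbe, 0xef])
// JSON serialization turns the typed array into { "0": 222, "1": 173, ... }
const stored = JSON.parse(JSON.stringify(headHash))
// Uint8Array.from(stored) would yield an empty array (the object has no
// length property), so the bytes are pulled out of the object first:
const restored = Uint8Array.from(Object.values(stored) as number[])
```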
@@ -213,13 +217,13 @@ export class DBManager {
if (this._cache[cacheString] === undefined) {
throw new Error(`Invalid cache: ${cacheString}`)
}

let value = this._cache[cacheString].get(dbKey)
if (!value) {
value = await this._db.get(dbKey, dbOpts)

if (value) {
this._cache[cacheString].set(dbKey, value)
if (value !== undefined) {
// Always cast values to Uint8Array since db sometimes returns values as `Buffer`
this._cache[cacheString].set(dbKey, Uint8Array.from(value))
}
}

const convertedOps: DBOpData[] = ops.map((op) => op.baseDBOp)
// update the current cache for each operation
ops.map((op) => op.updateCache(this._cache))

return this._db.batch(convertedOps as any)
}
}
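The cast in `get` above works because `Buffer` is a subclass of `Uint8Array`: a `Buffer` returned by the db passes `instanceof Uint8Array` checks, but normalizing through `Uint8Array.from` keeps the cache homogeneous. A quick sketch of the distinction (assumes a Node.js runtime; not the DBManager code itself):

```typescript
const fromDb = Buffer.from([1, 2, 3]) // what the db sometimes hands back
const isU8 = fromDb instanceof Uint8Array // true: Buffer subclasses Uint8Array
const normalized = Uint8Array.from(fromDb) // copies into a plain Uint8Array
const stillBuffer = normalized instanceof Buffer // false: plain typed array now
```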
2 changes: 1 addition & 1 deletion packages/blockchain/src/db/operation.ts
@@ -119,7 +119,7 @@ export class DBOp {
if (operationTarget === DBTarget.Heads) {
dbOperation.baseDBOp.valueEncoding = 'json'
} else {
dbOperation.baseDBOp.valueEncoding = 'binary'
dbOperation.baseDBOp.valueEncoding = 'view'
}

return dbOperation
2 changes: 1 addition & 1 deletion packages/client/lib/config.ts
@@ -539,7 +539,7 @@
static async getClientKey(datadir: string, common: Common) {
const networkDir = `${datadir}/${common.chainName()}`
const db = this.getConfigDB(networkDir)
const encodingOpts = { keyEncoding: 'utf8', valueEncoding: 'buffer' }
const encodingOpts = { keyEncoding: 'utf8', valueEncoding: 'view' }
const dbKey = 'config:client_key'
let key
try {
2 changes: 1 addition & 1 deletion packages/client/lib/execution/level.ts
@@ -3,7 +3,7 @@ import { MemoryLevel } from 'memory-level'
import type { BatchDBOp, DB } from '@ethereumjs/trie'
import type { AbstractLevel } from 'abstract-level'

export const ENCODING_OPTS = { keyEncoding: 'buffer', valueEncoding: 'buffer' }
export const ENCODING_OPTS = { keyEncoding: 'view', valueEncoding: 'view' }

/**
* LevelDB is a thin wrapper around the underlying levelup db,
4 changes: 2 additions & 2 deletions packages/client/lib/net/protocol/snapprotocol.ts
@@ -87,8 +87,8 @@ export interface SnapProtocolMethods {
slots: StorageData[][]
proof: Uint8Array[]
}>
getByteCodes: (opts: GetByteCodesOpts) => Promise<{ reqId: bigint; codes: Buffer[] }>
getTrieNodes: (opts: GetTrieNodesOpts) => Promise<{ reqId: bigint; nodes: Buffer[] }>
getByteCodes: (opts: GetByteCodesOpts) => Promise<{ reqId: bigint; codes: Uint8Array[] }>
getTrieNodes: (opts: GetTrieNodesOpts) => Promise<{ reqId: bigint; nodes: Uint8Array[] }>
}

/**
8 changes: 1 addition & 7 deletions packages/client/lib/net/server/rlpxserver.ts
@@ -282,13 +282,7 @@ export class RlpxServer extends Server {
}
})

this.rlpx.on('peer:error', (rlpxPeer: Devp2pRLPxPeer, error: Error) => {
const peerId = bytesToHex(rlpxPeer.getId() as Uint8Array)
if (peerId === null) {
return this.error(error)
}
this.error(error)
})
this.rlpx.on('peer:error', (rlpxPeer: Devp2pRLPxPeer, error: Error) => this.error(error))

this.rlpx.on('error', (e: Error) => this.error(e))

2 changes: 1 addition & 1 deletion packages/client/lib/sync/fetcher/accountfetcher.ts
@@ -322,7 +322,7 @@ export class AccountFetcher extends Fetcher<JobTask, AccountData[], AccountData>
// build record of accounts that need storage slots to be fetched
const storageRoot: Uint8Array =
account.body[2] instanceof Uint8Array ? account.body[2] : Uint8Array.from(account.body[2])
if (equalsBytes(storageRoot, KECCAK256_RLP)) {
if (!equalsBytes(storageRoot, KECCAK256_RLP)) {
storageFetchRequests.push({
accountHash: account.hash,
storageRoot,
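The one-character fix above flips the condition: storage slots should be fetched only when an account's `storageRoot` differs from `KECCAK256_RLP`, the keccak-256 hash of the RLP of an empty string, which is the root of an empty trie. A hedged sketch of the predicate (helpers inlined for illustration; the client imports the real ones from `@ethereumjs/util`):

```typescript
// keccak256(rlp('')) -- the canonical empty-trie root
const KECCAK256_RLP = Uint8Array.from(
  '56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421'
    .match(/../g)!
    .map((byte) => parseInt(byte, 16))
)
function equalsBytes(a: Uint8Array, b: Uint8Array): boolean {
  return a.length === b.length && a.every((v, i) => v === b[i])
}
// An account whose root IS the empty-trie root has no slots, so skip it
const needsStorageFetch = (storageRoot: Uint8Array) =>
  !equalsBytes(storageRoot, KECCAK256_RLP)
```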
2 changes: 1 addition & 1 deletion packages/client/lib/util/metaDBManager.ts
@@ -4,7 +4,7 @@ import type { Chain } from '../blockchain'
import type { Config } from '../config'
import type { AbstractLevel } from 'abstract-level'

const encodingOpts = { keyEncoding: 'buffer', valueEncoding: 'buffer' }
const encodingOpts = { keyEncoding: 'view', valueEncoding: 'view' }

/**
* Number prepended to the db key to avoid collisions
97 changes: 97 additions & 0 deletions packages/client/test/integration/pow.spec.ts
@@ -0,0 +1,97 @@
import { parseGethGenesisState } from '@ethereumjs/blockchain'
import { Common, Hardfork } from '@ethereumjs/common'
import { Address } from '@ethereumjs/util'
import { hexToBytes } from 'ethereum-cryptography/utils'
import { removeSync } from 'fs-extra'
import * as tape from 'tape'

import { Config } from '../../lib'
import { createInlineClient } from '../sim/simutils'

import type { EthereumClient } from '../../lib'

const pk = hexToBytes('95a602ff1ae30a2243f400dcf002561b9743b2ae9827b1008e3714a5cc1c0cfe')
const minerAddress = Address.fromPrivateKey(pk)

async function setupPowDevnet(prefundAddress: Address, cleanStart: boolean) {
if (cleanStart) {
removeSync(`datadir/devnet`)
}
const addr = prefundAddress.toString().slice(2)
const consensusConfig = { ethash: true }

const defaultChainData = {
config: {
chainId: 123456,
homesteadBlock: 0,
eip150Block: 0,
eip150Hash: '0x0000000000000000000000000000000000000000000000000000000000000000',
eip155Block: 0,
eip158Block: 0,
byzantiumBlock: 0,
constantinopleBlock: 0,
petersburgBlock: 0,
istanbulBlock: 0,
berlinBlock: 0,
londonBlock: 0,
...consensusConfig,
},
nonce: '0x0',
timestamp: '0x614b3731',
gasLimit: '0x47b760',
difficulty: '0x1',
mixHash: '0x0000000000000000000000000000000000000000000000000000000000000000',
coinbase: '0x0000000000000000000000000000000000000000',
number: '0x0',
gasUsed: '0x0',
parentHash: '0x0000000000000000000000000000000000000000000000000000000000000000',
baseFeePerGas: 7,
}
const extraData = '0x' + '0'.repeat(32)
const chainData = {
...defaultChainData,
extraData,
alloc: { [addr]: { balance: '0x10000000000000000000' } },
}
const common = Common.fromGethGenesis(chainData, { chain: 'devnet', hardfork: Hardfork.London })
const customGenesisState = parseGethGenesisState(chainData)

const config = new Config({
common,
transports: ['rlpx'],
bootnodes: [],
multiaddrs: [],
discDns: false,
discV4: false,
port: 30304,
maxAccountRange: (BigInt(2) ** BigInt(256) - BigInt(1)) / BigInt(10),
maxFetcherJobs: 10,
datadir: 'devnet',
accounts: [[minerAddress, pk]],
mine: true,
})

const client = await createInlineClient(config, common, customGenesisState)
return client
}

const mineBlockAndstopClient = async (client: EthereumClient, t: tape.Test) => {
await new Promise((resolve) => {
client.config.logger.on('data', (data) => {
if (data.message.includes('Miner: Found PoW solution') === true && client.started) {
t.pass('found a PoW solution')
void client.stop().then(() => {
t.ok(!client.started, 'client stopped successfully')
resolve(undefined)
})
}
})
})
}

tape('PoW client test', { timeout: 60000 }, async (t) => {
t.plan(3)
const client = await setupPowDevnet(minerAddress, true)
t.ok(client.started, 'client started successfully')
await mineBlockAndstopClient(client, t)
})
2 changes: 1 addition & 1 deletion packages/client/test/sync/fetcher/storagefetcher.spec.ts
@@ -306,7 +306,7 @@ tape('[StorageFetcher]', async (t) => {
const dummyStorageRoot = hexToBytes(
'39ed8daab7679c0b1b7cf3667c50108185d4d9d1431c24a1c35f696a58277f8f'
)
const dummyOrigin = Buffer.alloc(32)
const dummyOrigin = new Uint8Array(32)
try {
await fetcher['verifyRangeProof'](dummyStorageRoot, dummyOrigin, {
slots,
2 changes: 1 addition & 1 deletion packages/devp2p/examples/simple.ts
@@ -1,6 +1,6 @@
import { Chain, Common } from '@ethereumjs/common'
import chalk from 'chalk'
import { hexToBytes } from 'ethereum-cryptography/utils'
import { bytesToHex, hexToBytes } from 'ethereum-cryptography/utils'

import { DPT } from '../src/index'

2 changes: 1 addition & 1 deletion packages/devp2p/src/protocol/eth.ts
@@ -227,7 +227,7 @@ export class ETH extends Protocol {
_getStatusString(status: ETH.StatusMsg) {
let sStr = `[V:${bytesToInt(status[0] as Uint8Array)}, NID:${bytesToInt(
status[1] as Uint8Array
)}, TD:${status[2].length === 0 ? 0 : bytesToInt(status[2] as Uint8Array)}`
)}, TD:${status[2].length === 0 ? 0 : bytesToBigInt(status[2] as Uint8Array).toString()}`
sStr += `, BestH:${formatLogId(
bytesToHex(status[3] as Uint8Array),
this._verbose
4 changes: 2 additions & 2 deletions packages/devp2p/src/rlpx/peer.ts
@@ -587,11 +587,11 @@ export class Peer extends EventEmitter {
//
if (protocolName === 'Peer') {
try {
payload = RLP.decode(Uint8Array.from(payload))
payload = RLP.decode(payload)
} catch (e: any) {
if (msgCode === PREFIXES.DISCONNECT) {
if (compressed) {
payload = RLP.decode(Uint8Array.from(origPayload))
payload = RLP.decode(origPayload)
} else {
payload = RLP.decode(snappy.uncompress(payload))
}
2 changes: 1 addition & 1 deletion packages/devp2p/src/util.ts
@@ -126,7 +126,7 @@ export function toNewUint8Array(buf: Uint8Array): Uint8Array {

/*************************** ************************************************************/
// Methods borrowed from `node-ip` by Fedor Indutny (https://github.com/indutny/node-ip)
// and modified to use Uint8Arrays instead of Buffers
// and modified to use Uint8Arrays instead of Uint8Arrays
export const ipToString = (bytes: Uint8Array, offset?: number, length?: number) => {
offset = offset !== undefined ? ~~offset : 0
length = length ?? bytes.length - offset
3 changes: 3 additions & 0 deletions packages/ethash/src/index.ts
@@ -318,6 +318,9 @@ export class Ethash {
let data
try {
data = await this.cacheDB!.get(epoc, this.dbOpts)
// Fix uint8Arrays that get stored in DB as JSON dictionary of array indices and values
data.seed = Uint8Array.from(Object.values(data.seed))
data.cache = data.cache.map((el) => Uint8Array.from(Object.values(el)))
} catch (error: any) {
if (error.code !== 'LEVEL_NOT_FOUND') {
throw error
12 changes: 6 additions & 6 deletions packages/ethash/test/miner.spec.ts
@@ -6,11 +6,11 @@ import * as tape from 'tape'
import { Ethash } from '../src'

import type { BlockHeader } from '@ethereumjs/block'

const cacheDb = new MemoryLevel()
const common = new Common({ chain: Chain.Ropsten, hardfork: Hardfork.Petersburg })

tape('Check if miner works as expected', async function (t) {
const e = new Ethash(new MemoryLevel())
const e = new Ethash(cacheDb as any)

const block = Block.fromBlockData(
{
@@ -54,7 +54,7 @@
})

tape('Check if it is possible to mine Blocks and BlockHeaders', async function (t) {
const e = new Ethash(new MemoryLevel())
const e = new Ethash(cacheDb as any)

const block = Block.fromBlockData(
{
@@ -82,7 +82,7 @@ tape('Check if it is possible to mine Blocks and BlockHeaders', async function (
})

tape('Check if it is possible to stop the miner', async function (t) {
const e = new Ethash(new MemoryLevel())
const e = new Ethash(cacheDb as any)

const block = Block.fromBlockData(
{
@@ -104,7 +104,7 @@ tape('Check if it is possible to stop the miner', async function (t) {
})

tape('Check if it is possible to stop the miner', async function (t) {
const e = new Ethash(new MemoryLevel())
const e = new Ethash(cacheDb as any)

const block: any = {}

})

tape('Should keep common when mining blocks or headers', async function (t) {
const e = new Ethash(new MemoryLevel())
const e = new Ethash(cacheDb as any)

const block = Block.fromBlockData(
{
2 changes: 1 addition & 1 deletion packages/evm/src/memory.ts
@@ -75,7 +75,7 @@ export class Memory {
return loaded
}
const returnBytes = new Uint8Array(size)
// Copy the stored "buffer" from memory into the return Buffer
// Copy the stored "buffer" from memory into the return Uint8Array
returnBytes.set(loaded)

return returnBytes
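The `avoidCopy` optimization mentioned in the commit list above ("added avoidCopy parameter to memory.read()") hinges on `subarray` returning a view over the backing store, while the default path allocates a fresh array. A simplified model of the trade-off (the real `Memory.read` in `@ethereumjs/evm` may differ in signature and details):

```typescript
class Memory {
  private store = new Uint8Array(8192)
  write(offset: number, value: number) {
    this.store[offset] = value
  }
  read(offset: number, size: number, avoidCopy = false): Uint8Array {
    const loaded = this.store.subarray(offset, offset + size) // view, zero-copy
    if (avoidCopy) return loaded
    const copied = new Uint8Array(size) // copying path: caller owns the bytes
    copied.set(loaded)
    return copied
  }
}
const m = new Memory()
m.write(0, 7)
const view = m.read(0, 4, true)
const copy = m.read(0, 4)
m.write(0, 9)
// view[0] is now 9 (the view observes later writes); copy[0] stays 7
```

The view is safe only when the caller consumes the bytes before memory is written again, which is why the copying path remains the default.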
2 changes: 1 addition & 1 deletion packages/evm/src/precompiles/0a-bls12-g1add.ts
@@ -57,7 +57,7 @@ export async function precompile0a(opts: PrecompileInput): Promise<ExecResult> {
}
}

// convert input to mcl G1 points, add them, and convert the output to a Buffer.
// convert input to mcl G1 points, add them, and convert the output to a Uint8Array.
let mclPoint1
let mclPoint2
try {
2 changes: 1 addition & 1 deletion packages/evm/src/precompiles/0b-bls12-g1mul.ts
@@ -59,7 +59,7 @@ export async function precompile0b(opts: PrecompileInput): Promise<ExecResult> {
}
}

// convert input to mcl G1 points, add them, and convert the output to a Buffer.
// convert input to mcl G1 points, add them, and convert the output to a Uint8Array.

let mclPoint
try {
2 changes: 1 addition & 1 deletion packages/evm/src/precompiles/0d-bls12-g2add.ts
@@ -63,7 +63,7 @@ export async function precompile0d(opts: PrecompileInput): Promise<ExecResult> {

// TODO: verify that point is on G2

// convert input to mcl G2 points, add them, and convert the output to a Buffer.
// convert input to mcl G2 points, add them, and convert the output to a Uint8Array.
let mclPoint1
let mclPoint2

2 changes: 1 addition & 1 deletion packages/evm/src/precompiles/0e-bls12-g2mul.ts
@@ -63,7 +63,7 @@ export async function precompile0e(opts: PrecompileInput): Promise<ExecResult> {

// TODO: verify that point is on G2

// convert input to mcl G2 point/Fr point, add them, and convert the output to a Buffer.
// convert input to mcl G2 point/Fr point, add them, and convert the output to a Uint8Array.
let mclPoint
try {
mclPoint = BLS12_381_ToG2Point(opts.data.subarray(0, 256), mcl)
