Failed to run the default network #681

Open
snail-inO opened this issue Jun 18, 2024 · 6 comments

@snail-inO

Issue

I tried to run a network with the default configs using the following command:
kurtosis run github.com/ethpandaops/ethereum-package

Platform

OS

Distributor ID:	Ubuntu
Description:	Ubuntu 22.04.4 LTS
Release:	22.04
Codename:	jammy

Docker

Client:
 Version:           24.0.5
 API version:       1.43
 Go version:        go1.20.14
 Git commit:        ced0996
 Built:             Thu Jun 13 22:15:30 2024
 OS/Arch:           linux/amd64
 Context:           default

Server:
 Engine:
  Version:          24.0.5
  API version:      1.43 (minimum version 1.12)
  Go version:       go1.20.14
  Git commit:       a61e2b4
  Built:            Thu Jun 13 22:15:59 2024
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          v1.6.21
  GitCommit:        3dce8eb055cbb6872793272b4f20ed16117344f8
 runc:
  Version:          1.1.7
  GitCommit:
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0

Kurtosis

A Kurtosis engine is running with the following info:
Version:   0.90.1

Logs

INFO[2024-06-18T16:19:56-04:00] Creating a new enclave for Starlark to run inside...
INFO[2024-06-18T16:19:58-04:00] Enclave 'towering-playa' created successfully

Container images used in this run:
> protolambda/eth2-val-tools:latest - locally cached
> ethpandaops/execution-monitor:master - locally cached
> badouralix/curl-jq - locally cached
> python:3.11-alpine - locally cached
> grafana/grafana:latest-ubuntu - locally cached
> ethpandaops/tx-fuzz:master - locally cached
> ethpandaops/ethereum-genesis-generator:3.2.1 - locally cached
> ethpandaops/dora:master - locally cached
> ethpandaops/lighthouse:stable - locally cached
> ethereum/client-go:latest - locally cached
> prom/prometheus:latest - locally cached
> ethpandaops/beacon-metrics-gazer:master - locally cached

Uploading file '/static_files/jwt/jwtsecret' to files artifact 'jwt_file'
Files with artifact name 'jwt_file' uploaded with artifact UUID '3ec462c34c2d435c9a362b1b32049d4e'

Uploading file '/static_files/keymanager/keymanager.txt' to files artifact 'keymanager_file'
Files with artifact name 'keymanager_file' uploaded with artifact UUID 'a90fb74862664c53b22dc6239524a606'

Printing a message
Read the prometheus, grafana templates

Printing a message
Launching participant network with 1 participants and the following network params struct(additional_preloaded_contracts = {}, churn_limit_quotient = 65536, custody_requirement = 1, data_column_sidecar_subnet_count = 32, deneb_fork_epoch = 0, deposit_contract_address = "0x4242424242424242424242424242424242424242", eip7594_fork_epoch = 100000001, eip7594_fork_version = "0x70000038", ejection_balance = 16000000000, electra_fork_epoch = 100000000, eth1_follow_distance = 2048, genesis_delay = 20, max_per_epoch_activation_churn_limit = 8, min_validator_withdrawability_delay = 256, network = "kurtosis", network_id = "3151908", network_sync_base_url = "https://ethpandaops-ethereum-node-snapshots.ams3.cdn.digitaloceanspaces.com/", num_validator_keys_per_node = 64, preregistered_validator_count = 0, preregistered_validator_keys_mnemonic = "giant issue aisle success illegal bike spike question tent bar rely arctic volcano long crawl hungry vocal artwork sniff fantasy very lucky have athlete", preset = "mainnet", samples_per_slot = 8, seconds_per_slot = 12, shard_committee_period = 256, target_number_of_peers = 70)

Printing a message
Generating cl validator key stores

Adding service with name 'validator-key-generation-cl-validator-keystore' and image 'protolambda/eth2-val-tools:latest'
Service 'validator-key-generation-cl-validator-keystore' added with service UUID 'd204ee7434d744f38a2ca2f2e8b7801d'

Generating keystores
Command returned with exit code '0' with no output

Verifying whether two values meet a certain condition '=='
Verification succeeded. Value is '0'.

Storing files from service 'validator-key-generation-cl-validator-keystore' at path '/node-0-keystores/' to files artifact with name '1-lighthouse-geth-0-63-0'
Files with artifact name '1-lighthouse-geth-0-63-0' uploaded with artifact UUID '9e0c4e81ceba4a3e9988a10e518907b2'

Storing prysm password in a file
Command returned with exit code '0' with no output

Verifying whether two values meet a certain condition '=='
Verification succeeded. Value is '0'.

Storing files from service 'validator-key-generation-cl-validator-keystore' at path '/tmp/prysm-password.txt' to files artifact with name 'prysm-password'
Files with artifact name 'prysm-password' uploaded with artifact UUID '96067167a5114fa59af93c6fc78d991a'

Printing a message
{
	"per_node_keystores": [
		{
			"files_artifact_uuid": "1-lighthouse-geth-0-63-0",
			"nimbus_keys_relative_dirpath": "/nimbus-keys",
			"prysm_relative_dirpath": "/prysm",
			"raw_keys_relative_dirpath": "/keys",
			"raw_root_dirpath": "",
			"raw_secrets_relative_dirpath": "/secrets",
			"teku_keys_relative_dirpath": "/teku-keys",
			"teku_secrets_relative_dirpath": "/teku-secrets"
		}
	],
	"prysm_password_artifact_uuid": "prysm-password",
	"prysm_password_relative_filepath": "prysm-password.txt"
}

Getting final genesis timestamp
Command returned with exit code '0' and the following output: 1718742036

Printing a message
Generating EL CL data

Rendering a template to a files artifact with name 'genesis-el-cl-env-file'
Templates artifact name 'genesis-el-cl-env-file' rendered with artifact UUID 'ae6482dd94284937b7fc84b27845f5e9'

Creating genesis
Command returned with exit code '0' and the following output:
--------------------
+ '[' -f /data/metadata/genesis.json ']'
++ mktemp -d -t ci-XXXXXXXXXX
+ tmp_dir=/tmp/ci-QxTwsQ6zfK
+ mkdir -p /data/metadata
+ envsubst
+ python3 /apps/el-gen/genesis_geth.py /tmp/ci-QxTwsQ6zfK/genesis-config.yaml
+ python3 /apps/el-gen/genesis_chainspec.py /tmp/ci-QxTwsQ6zfK/genesis-config.yaml
+ python3 /apps/el-gen/genesis_besu.py /tmp/ci-QxTwsQ6zfK/genesis-config.yaml
+ gen_cl_config
+ . /apps/el-gen/.venv/bin/activate
++ deactivate nondestructive
++ '[' -n /root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ']'
++ PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ unset _OLD_VIRTUAL_PATH
++ '[' -n '' ']'
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '' ']'
++ unset VIRTUAL_ENV
++ unset VIRTUAL_ENV_PROMPT
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/apps/el-gen/.venv
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ PATH=/apps/el-gen/.venv/bin:/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1='(.venv) '
++ PS1='(.venv) (.venv) '
++ export PS1
++ VIRTUAL_ENV_PROMPT='(.venv) '
++ export VIRTUAL_ENV_PROMPT
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
+ set -x
+ '[' -f /data/metadata/genesis.ssz ']'
++ mktemp -d -t ci-XXXXXXXXXX
+ tmp_dir=/tmp/ci-iIWIygkXzO
+ mkdir -p /data/metadata
+ envsubst
+ envsubst
+ [[ mainnet == \m\i\n\i\m\a\l ]]
+ cp /tmp/ci-iIWIygkXzO/mnemonics.yaml /data/metadata/mnemonics.yaml
+ grep DEPOSIT_CONTRACT_ADDRESS /data/metadata/config.yaml
+ cut -d ' ' -f2
+ echo 0
+ echo enr:-Iq4QJk4WqRkjsX5c2CXtOra6HnxN-BMXnWhmhEQO9Bn9iABTJGdjUOurM7Btj1ouKaFkvTRoju5vz2GPmVON2dffQKGAX53x8JigmlkgnY0gmlwhLKAlv6Jc2VjcDI1NmsxoQK6S-Cii_KmfFdUJL2TANL3ksaKUnNXvTCv1tLwXs0QgIN1ZHCCIyk
+ envsubst
+ genesis_args=(deneb --config /data/metadata/config.yaml --mnemonics $tmp_dir/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 $PRESET_BASE --preset-altair $PRESET_BASE --preset-bellatrix $PRESET_BASE --preset-capella $PRESET_BASE --preset-deneb $PRESET_BASE)
+ [[ 0x00 == \0\x\0\1 ]]
+ [[ '' != '' ]]
+ [[ '' != '' ]]
+ genesis_args+=(--eth1-config /data/metadata/genesis.json)
+ '[' -z '' ']'
+ zcli_args=(pretty deneb BeaconState --preset-phase0 $PRESET_BASE --preset-altair $PRESET_BASE --preset-bellatrix $PRESET_BASE --preset-capella $PRESET_BASE --preset-deneb $PRESET_BASE /data/metadata/genesis.ssz)
+ /usr/local/bin/eth2-testnet-genesis deneb --config /data/metadata/config.yaml --mnemonics /tmp/ci-iIWIygkXzO/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet --eth1-config /data/metadata/genesis.json
zrnt version: v0.32.3
Using CL MIN_GENESIS_TIME for genesis timestamp
processing mnemonic 0, for 64 validators
Writing pubkeys list file...
generated 64 validators from mnemonic yaml (/tmp/ci-iIWIygkXzO/mnemonics.yaml)
eth2 genesis at 1718742036 + 20 = 1718742056  (2024-06-18 20:20:56 +0000 UTC)
done preparing state, serializing SSZ now...
done!
+ /usr/local/bin/zcli pretty deneb BeaconState --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet /data/metadata/genesis.ssz
+ echo 'Genesis args: deneb' --config /data/metadata/config.yaml --mnemonics /tmp/ci-iIWIygkXzO/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet --eth1-config /data/metadata/genesis.json
Genesis args: deneb --config /data/metadata/config.yaml --mnemonics /tmp/ci-iIWIygkXzO/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet --eth1-config /data/metadata/genesis.json
++ jq -r .latest_execution_payload_header.block_number /data/metadata/parsedBeaconState.json
+ echo 'Genesis block number: 0'
Genesis block number: 0
++ jq -r .latest_execution_payload_header.block_hash /data/metadata/parsedBeaconState.json
+ echo 'Genesis block hash: 0x1fc1e5cde3285fc616f5beef09b3eb6a6363ff7971ec1d146104bf350093d779'
Genesis block hash: 0x1fc1e5cde3285fc616f5beef09b3eb6a6363ff7971ec1d146104bf350093d779
+ jq -r .eth1_data.block_hash /data/metadata/parsedBeaconState.json
+ tr -d '\n'
+ jq -r .genesis_validators_root /data/metadata/parsedBeaconState.json
+ tr -d '\n'
+ gen_shared_files
+ . /apps/el-gen/.venv/bin/activate
++ deactivate nondestructive
++ '[' -n /root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ']'
++ PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ unset _OLD_VIRTUAL_PATH
++ '[' -n '' ']'
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '(.venv) ' ']'
++ PS1='(.venv) '
++ export PS1
++ unset _OLD_VIRTUAL_PS1
++ unset VIRTUAL_ENV
++ unset VIRTUAL_ENV_PROMPT
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/apps/el-gen/.venv
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ PATH=/apps/el-gen/.venv/bin:/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1='(.venv) '
++ PS1='(.venv) (.venv) '
++ export PS1
++ VIRTUAL_ENV_PROMPT='(.venv) '
++ export VIRTUAL_ENV_PROMPT
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
+ set -x
+ mkdir -p /data/metadata
+ '[' -f /data/jwt/jwtsecret ']'
+ mkdir -p /data/jwt
++ openssl rand -hex 32
++ tr -d '\n'
+ echo -n 0x3006de8337ca84021d87bedac193f3a37917f752eacb783a20bb18aaceab71af
+ '[' -f /data/metadata/genesis.json ']'
++ cat /data/metadata/genesis.json
++ jq -r '.config.terminalTotalDifficulty | tostring'
+ terminalTotalDifficulty=0
+ sed -i 's/TERMINAL_TOTAL_DIFFICULTY:.*/TERMINAL_TOTAL_DIFFICULTY: 0/' /data/metadata/config.yaml
+ '[' false = true ']'

--------------------

Reading genesis validators root
Command returned with exit code '0' and the following output: 0xd61ea484febacfae5298d52a2b581f3e305a51f3112a9241b968dccf019f7b11

Reading cancun time from genesis
Command returned with exit code '0' and the following output: 0

Reading prague time from genesis
Command returned with exit code '0' and the following output: 40118742056

Adding service with name 'el-1-geth-lighthouse' and image 'ethereum/client-go:latest'
Service 'el-1-geth-lighthouse' added with service UUID '18e14c932cda422489464ad7144e8088'

Waiting for at most '15m0s' for service 'el-1-geth-lighthouse' to reach a certain state
Wait took 1 tries (20.49006ms in total). Assertion passed with following:
Request had response code '200' and body "{\"jsonrpc\":\"2.0\",\"id\":1,\"result\":{\"id\":\"fb332d9aca0d6bc10234c9ec7c41360a5d015117eab78dc086581ce505c9ac65\",\"name\":\"Geth/v1.14.4-unstable-74edc938/linux-amd64/go1.22.3\",\"enode\":\"enode://aad49a4d15df938f5cd546e1f0ca05e8f1c35a6460338dcc0283d39b5810044a0ea6c4c98d4f6cd8501f7bf9b3971916cd4bd933647e604870baf7eefce5a8b0@172.16.0.10:30303\",\"enr\":\"enr:-Ki4QCq05cXnkWntG4NUjvVQhURnVG0pMCw5Eg43RV50KnD3fD4jZT2YtWoazreQCercXTrpkpLLesgMVuDHQ6nERpCGAZAtAfjmg2V0aMzLhEWFXTiFCVdDbCiCaWSCdjSCaXCErBAAColzZWNwMjU2azGhAqrUmk0V35OPXNVG4fDKBejxw1pkYDONzAKD05tYEARKhHNuYXDAg3RjcIJ2X4N1ZHCCdl8\",\"ip\":\"172.16.0.10\",\"ports\":{\"discovery\":30303,\"listener\":30303},\"listenAddr\":\"[::]:30303\",\"protocols\":{\"eth\":{\"network\":3151908,\"difficulty\":1,\"genesis\":\"0x1fc1e5cde3285fc616f5beef09b3eb6a6363ff7971ec1d146104bf350093d779\",\"config\":{\"chainId\":3151908,\"homesteadBlock\":0,\"eip150Block\":0,\"eip155Block\":0,\"eip158Block\":0,\"byzantiumBlock\":0,\"constantinopleBlock\":0,\"petersburgBlock\":0,\"istanbulBlock\":0,\"berlinBlock\":0,\"londonBlock\":0,\"mergeNetsplitBlock\":0,\"shanghaiTime\":0,\"cancunTime\":0,\"pragueTime\":40118742056,\"terminalTotalDifficulty\":0,\"terminalTotalDifficultyPassed\":true},\"head\":\"0x1fc1e5cde3285fc616f5beef09b3eb6a6363ff7971ec1d146104bf350093d779\"},\"snap\":{}}}}\n", with extracted fields:
'extract.enode': "enode://aad49a4d15df938f5cd546e1f0ca05e8f1c35a6460338dcc0283d39b5810044a0ea6c4c98d4f6cd8501f7bf9b3971916cd4bd933647e604870baf7eefce5a8b0@172.16.0.10:30303"
'extract.enr': "enr:-Ki4QCq05cXnkWntG4NUjvVQhURnVG0pMCw5Eg43RV50KnD3fD4jZT2YtWoazreQCercXTrpkpLLesgMVuDHQ6nERpCGAZAtAfjmg2V0aMzLhEWFXTiFCVdDbCiCaWSCdjSCaXCErBAAColzZWNwMjU2azGhAqrUmk0V35OPXNVG4fDKBejxw1pkYDONzAKD05tYEARKhHNuYXDAg3RjcIJ2X4N1ZHCCdl8"

Printing a message
Successfully added 1 EL participants

Printing a message
Launching CL network

Adding service with name 'cl-1-lighthouse-geth' and image 'ethpandaops/lighthouse:stable'
There was an error executing Starlark code
An error occurred executing instruction (number 25) at github.com/ethpandaops/ethereum-package/src/cl/lighthouse/lighthouse_launcher.star[157:38]:
  add_service(name="cl-1-lighthouse-geth", config=ServiceConfig(image="ethpandaops/lighthouse:stable", ports={"http": PortSpec(number=4000, transport_protocol="TCP", application_protocol="http"), "metrics": PortSpec(number=5054, transport_protocol="TCP", application_protocol="http"), "tcp-discovery": PortSpec(number=9000, transport_protocol="TCP", application_protocol=""), "udp-discovery": PortSpec(number=9000, transport_protocol="UDP", application_protocol="")}, public_ports={}, files={"/jwt": "jwt_file", "/network-configs": "el_cl_genesis_data"}, cmd=["lighthouse", "beacon_node", "--debug-level=info", "--datadir=/data/lighthouse/beacon-data", "--disable-enr-auto-update", "--enr-address=KURTOSIS_IP_ADDR_PLACEHOLDER", "--enr-udp-port=9000", "--enr-tcp-port=9000", "--listen-address=0.0.0.0", "--port=9000", "--http", "--http-address=0.0.0.0", "--http-port=4000", "--http-allow-sync-stalled", "--slots-per-restore-point=32", "--disable-packet-filter", "--execution-endpoints=http://{{kurtosis:fb0f2406e5774bd38202073a75434ea5:ip_address.runtime_value}}:8551", "--jwt-secrets=/jwt/jwtsecret", "--suggested-fee-recipient=0x8943545177806ED17B9F23F0a21ee5948eCaa776", "--subscribe-all-subnets", "--metrics", "--metrics-address=0.0.0.0", "--metrics-allow-origin=*", "--metrics-port=5054", "--enable-private-discovery", "--testnet-dir=/network-configs"], env_vars={"RUST_BACKTRACE": "full"}, private_ip_address_placeholder="KURTOSIS_IP_ADDR_PLACEHOLDER", max_cpu=1000, min_cpu=50, max_memory=1024, min_memory=256, ready_conditions=ReadyCondition(recipe=GetHttpRequestRecipe(port_id="http", endpoint="/eth/v1/node/health"), field="code", assertion="IN", target_value=[200, 206], timeout="15m"), labels={"ethereum-package.client": "lighthouse", "ethereum-package.client-image": "ethpandaops-lighthouse_stable", "ethereum-package.client-type": "beacon", "ethereum-package.connected-client": "geth", "ethereum-package.sha256": ""}, tolerations=[], node_selectors={}))
  Caused by: Unexpected error occurred starting service 'cl-1-lighthouse-geth'
  Caused by: An error occurred waiting for all TCP and UDP ports to be open for service 'cl-1-lighthouse-geth' with private IP '172.16.0.11'; this is usually due to a misconfiguration in the service itself, so here are the logs:
  == SERVICE 'cl-1-lighthouse-geth' LOGS ===================================
  Jun 18 20:20:24.100 INFO Logging to file                         path: "/data/lighthouse/beacon-data/beacon/logs/beacon.log"
  Jun 18 20:20:24.109 INFO Lighthouse started                      version: Lighthouse/v5.2.0-f1d88ba
  Jun 18 20:20:24.109 INFO Configured for network                  name: custom (/network-configs)
  Jun 18 20:20:24.109 INFO Data directory initialised              datadir: /data/lighthouse/beacon-data
  Jun 18 20:20:24.110 WARN Discv5 packet filter is disabled
  Jun 18 20:20:24.110 WARN Ignoring --http-allow-sync-stalled      info: this flag is deprecated and will be removed
  Jun 18 20:20:24.117 INFO Deposit contract                        address: 0x4242424242424242424242424242424242424242, deploy_block: 0
  Jun 18 20:20:24.128 INFO Blob DB initialized                     oldest_blob_slot: Some(Slot(0)), path: "/data/lighthouse/beacon-data/beacon/blobs_db", service: freezer_db
  Jun 18 20:20:24.154 INFO Starting from known genesis state       service: beacon

  == FINISHED SERVICE 'cl-1-lighthouse-geth' LOGS ===================================
  Caused by: An error occurred while waiting for all TCP and UDP ports to be open
  Caused by: Unsuccessful ports check for IP '172.16.0.11' and port spec '{privatePortSpec:0xc000410480}', even after '240' retries with '500' milliseconds in between retries. Timeout '2m0s' has been reached
  Caused by: An error occurred while calling network address '172.16.0.11:5054' with port protocol 'TCP' and using time out '200ms'
  Caused by: dial tcp 172.16.0.11:5054: connect: no route to host

Error encountered running Starlark code.

⭐ us on GitHub - https://github.com/kurtosis-tech/kurtosis
INFO[2024-06-18T16:22:24-04:00] =======================================================
INFO[2024-06-18T16:22:24-04:00] ||          Created enclave: towering-playa          ||
INFO[2024-06-18T16:22:24-04:00] =======================================================
Name:            towering-playa
UUID:            53190be6bade
Status:          RUNNING
Creation Time:   Tue, 18 Jun 2024 16:19:56 EDT
Flags:

========================================= Files Artifacts =========================================
UUID           Name
9e0c4e81ceba   1-lighthouse-geth-0-63-0
9b03dd35555a   el_cl_genesis_data
9c191b80a0db   final-genesis-timestamp
ae6482dd9428   genesis-el-cl-env-file
cdddf1f2d889   genesis_validators_root
3ec462c34c2d   jwt_file
a90fb7486266   keymanager_file
96067167a511   prysm-password

========================================== User Services ==========================================
UUID           Name                                             Ports                                         Status
18e14c932cda   el-1-geth-lighthouse                             engine-rpc: 8551/tcp -> 127.0.0.1:32922       RUNNING
                                                                metrics: 9001/tcp -> 127.0.0.1:32921
                                                                rpc: 8545/tcp -> http://127.0.0.1:32924
                                                                tcp-discovery: 30303/tcp -> 127.0.0.1:32920
                                                                udp-discovery: 30303/udp -> 127.0.0.1:32794
                                                                ws: 8546/tcp -> 127.0.0.1:32923
d204ee7434d7   validator-key-generation-cl-validator-keystore   <none>                                        RUNNING

@barnabasbusa
Contributor

You might have an older image locally cached. Can you please run it with --image-download always?

I can't replicate this issue locally.
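
Spelled out, the suggested command would look like this (package ref taken from the original report above):

kurtosis run github.com/ethpandaops/ethereum-package --image-download always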

@snail-inO
Author

You might have an older image locally cached. Can you please run it with --image-download always?

I can't replicate this issue locally.

Thanks for the suggestion, but it didn't work; here are the logs:

INFO[2024-06-19T14:00:41-04:00] Creating a new enclave for Starlark to run inside...
INFO[2024-06-19T14:00:43-04:00] Enclave 'aged-fjord' created successfully

Container images used in this run:
> grafana/grafana:latest-ubuntu - remotely downloaded
> protolambda/eth2-val-tools:latest - remotely downloaded
> ethereum/client-go:latest - remotely downloaded
> python:3.11-alpine - remotely downloaded
> ethpandaops/execution-monitor:master - remotely downloaded
> badouralix/curl-jq - remotely downloaded
> ethpandaops/lighthouse:stable - remotely downloaded
> ethpandaops/ethereum-genesis-generator:3.2.1 - remotely downloaded
> ethpandaops/dora:master - remotely downloaded
> ethpandaops/beacon-metrics-gazer:master - remotely downloaded
> ethpandaops/tx-fuzz:master - remotely downloaded
> prom/prometheus:latest - remotely downloaded

Uploading file '/static_files/jwt/jwtsecret' to files artifact 'jwt_file'
Files with artifact name 'jwt_file' uploaded with artifact UUID '57c0b55a8fa148dcacaf31d498b15b2b'

Uploading file '/static_files/keymanager/keymanager.txt' to files artifact 'keymanager_file'
Files with artifact name 'keymanager_file' uploaded with artifact UUID 'cfae53fe5b594c0986436757f6501040'

Printing a message
Read the prometheus, grafana templates

Printing a message
Launching participant network with 1 participants and the following network params struct(additional_preloaded_contracts = {}, churn_limit_quotient = 65536, custody_requirement = 1, data_column_sidecar_subnet_count = 32, deneb_fork_epoch = 0, deposit_contract_address = "0x4242424242424242424242424242424242424242", eip7594_fork_epoch = 100000001, eip7594_fork_version = "0x70000038", ejection_balance = 16000000000, electra_fork_epoch = 100000000, eth1_follow_distance = 2048, genesis_delay = 20, max_per_epoch_activation_churn_limit = 8, min_validator_withdrawability_delay = 256, network = "kurtosis", network_id = "3151908", network_sync_base_url = "https://ethpandaops-ethereum-node-snapshots.ams3.cdn.digitaloceanspaces.com/", num_validator_keys_per_node = 64, preregistered_validator_count = 0, preregistered_validator_keys_mnemonic = "giant issue aisle success illegal bike spike question tent bar rely arctic volcano long crawl hungry vocal artwork sniff fantasy very lucky have athlete", preset = "mainnet", samples_per_slot = 8, seconds_per_slot = 12, shard_committee_period = 256, target_number_of_peers = 70)

Printing a message
Generating cl validator key stores

Adding service with name 'validator-key-generation-cl-validator-keystore' and image 'protolambda/eth2-val-tools:latest'
Service 'validator-key-generation-cl-validator-keystore' added with service UUID '2bf7b0bc2d4e41288a216ecc317ebf55'

Generating keystores
Command returned with exit code '0' with no output

Verifying whether two values meet a certain condition '=='
Verification succeeded. Value is '0'.

Storing files from service 'validator-key-generation-cl-validator-keystore' at path '/node-0-keystores/' to files artifact with name '1-lighthouse-geth-0-63-0'
Files with artifact name '1-lighthouse-geth-0-63-0' uploaded with artifact UUID '41e0b44392ae427f855097d28c3d4fc1'

Storing prysm password in a file
Command returned with exit code '0' with no output

Verifying whether two values meet a certain condition '=='
Verification succeeded. Value is '0'.

Storing files from service 'validator-key-generation-cl-validator-keystore' at path '/tmp/prysm-password.txt' to files artifact with name 'prysm-password'
Files with artifact name 'prysm-password' uploaded with artifact UUID '3e76199647414294bed899026d08ceb9'

Printing a message
{
	"per_node_keystores": [
		{
			"files_artifact_uuid": "1-lighthouse-geth-0-63-0",
			"nimbus_keys_relative_dirpath": "/nimbus-keys",
			"prysm_relative_dirpath": "/prysm",
			"raw_keys_relative_dirpath": "/keys",
			"raw_root_dirpath": "",
			"raw_secrets_relative_dirpath": "/secrets",
			"teku_keys_relative_dirpath": "/teku-keys",
			"teku_secrets_relative_dirpath": "/teku-secrets"
		}
	],
	"prysm_password_artifact_uuid": "prysm-password",
	"prysm_password_relative_filepath": "prysm-password.txt"
}

Getting final genesis timestamp
Command returned with exit code '0' and the following output: 1718820082

Printing a message
Generating EL CL data

Rendering a template to a files artifact with name 'genesis-el-cl-env-file'
Templates artifact name 'genesis-el-cl-env-file' rendered with artifact UUID '263f5cdc876b4edbbe96edde96bd159c'

Creating genesis
Command returned with exit code '0' and the following output:
--------------------
+ '[' -f /data/metadata/genesis.json ']'
++ mktemp -d -t ci-XXXXXXXXXX
+ tmp_dir=/tmp/ci-lNLHBHg4uN
+ mkdir -p /data/metadata
+ envsubst
+ python3 /apps/el-gen/genesis_geth.py /tmp/ci-lNLHBHg4uN/genesis-config.yaml
+ python3 /apps/el-gen/genesis_chainspec.py /tmp/ci-lNLHBHg4uN/genesis-config.yaml
+ python3 /apps/el-gen/genesis_besu.py /tmp/ci-lNLHBHg4uN/genesis-config.yaml
+ gen_cl_config
+ . /apps/el-gen/.venv/bin/activate
++ deactivate nondestructive
++ '[' -n /root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ']'
++ PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ unset _OLD_VIRTUAL_PATH
++ '[' -n '' ']'
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '' ']'
++ unset VIRTUAL_ENV
++ unset VIRTUAL_ENV_PROMPT
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/apps/el-gen/.venv
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ PATH=/apps/el-gen/.venv/bin:/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1='(.venv) '
++ PS1='(.venv) (.venv) '
++ export PS1
++ VIRTUAL_ENV_PROMPT='(.venv) '
++ export VIRTUAL_ENV_PROMPT
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
+ set -x
+ '[' -f /data/metadata/genesis.ssz ']'
++ mktemp -d -t ci-XXXXXXXXXX
+ tmp_dir=/tmp/ci-olTrPuoGCA
+ mkdir -p /data/metadata
+ envsubst
+ envsubst
+ [[ mainnet == \m\i\n\i\m\a\l ]]
+ cp /tmp/ci-olTrPuoGCA/mnemonics.yaml /data/metadata/mnemonics.yaml
+ grep DEPOSIT_CONTRACT_ADDRESS /data/metadata/config.yaml
+ cut -d ' ' -f2
+ echo 0
+ echo enr:-Iq4QJk4WqRkjsX5c2CXtOra6HnxN-BMXnWhmhEQO9Bn9iABTJGdjUOurM7Btj1ouKaFkvTRoju5vz2GPmVON2dffQKGAX53x8JigmlkgnY0gmlwhLKAlv6Jc2VjcDI1NmsxoQK6S-Cii_KmfFdUJL2TANL3ksaKUnNXvTCv1tLwXs0QgIN1ZHCCIyk
+ envsubst
+ genesis_args=(deneb --config /data/metadata/config.yaml --mnemonics $tmp_dir/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 $PRESET_BASE --preset-altair $PRESET_BASE --preset-bellatrix $PRESET_BASE --preset-capella $PRESET_BASE --preset-deneb $PRESET_BASE)
+ [[ 0x00 == \0\x\0\1 ]]
+ [[ '' != '' ]]
+ [[ '' != '' ]]
+ genesis_args+=(--eth1-config /data/metadata/genesis.json)
+ '[' -z '' ']'
+ zcli_args=(pretty deneb BeaconState --preset-phase0 $PRESET_BASE --preset-altair $PRESET_BASE --preset-bellatrix $PRESET_BASE --preset-capella $PRESET_BASE --preset-deneb $PRESET_BASE /data/metadata/genesis.ssz)
+ /usr/local/bin/eth2-testnet-genesis deneb --config /data/metadata/config.yaml --mnemonics /tmp/ci-olTrPuoGCA/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet --eth1-config /data/metadata/genesis.json
zrnt version: v0.32.3
Using CL MIN_GENESIS_TIME for genesis timestamp
processing mnemonic 0, for 64 validators
Writing pubkeys list file...
generated 64 validators from mnemonic yaml (/tmp/ci-olTrPuoGCA/mnemonics.yaml)
eth2 genesis at 1718820082 + 20 = 1718820102  (2024-06-19 18:01:42 +0000 UTC)
done preparing state, serializing SSZ now...
done!
+ /usr/local/bin/zcli pretty deneb BeaconState --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet /data/metadata/genesis.ssz
Genesis args: deneb --config /data/metadata/config.yaml --mnemonics /tmp/ci-olTrPuoGCA/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet --eth1-config /data/metadata/genesis.json
+ echo 'Genesis args: deneb' --config /data/metadata/config.yaml --mnemonics /tmp/ci-olTrPuoGCA/mnemonics.yaml --tranches-dir /data/metadata/tranches --state-output /data/metadata/genesis.ssz --preset-phase0 mainnet --preset-altair mainnet --preset-bellatrix mainnet --preset-capella mainnet --preset-deneb mainnet --eth1-config /data/metadata/genesis.json
++ jq -r .latest_execution_payload_header.block_number /data/metadata/parsedBeaconState.json
+ echo 'Genesis block number: 0'
Genesis block number: 0
++ jq -r .latest_execution_payload_header.block_hash /data/metadata/parsedBeaconState.json
+ echo 'Genesis block hash: 0x3d0a48a9a3b7eb9ae754baaeeb6e8b96c24307499bd9d97424e76e0fcd5fd113'
Genesis block hash: 0x3d0a48a9a3b7eb9ae754baaeeb6e8b96c24307499bd9d97424e76e0fcd5fd113
+ jq -r .eth1_data.block_hash /data/metadata/parsedBeaconState.json
+ tr -d '\n'
+ jq -r .genesis_validators_root /data/metadata/parsedBeaconState.json
+ tr -d '\n'
+ gen_shared_files
+ . /apps/el-gen/.venv/bin/activate
++ deactivate nondestructive
++ '[' -n /root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ']'
++ PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ unset _OLD_VIRTUAL_PATH
++ '[' -n '' ']'
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '(.venv) ' ']'
++ PS1='(.venv) '
++ export PS1
++ unset _OLD_VIRTUAL_PS1
++ unset VIRTUAL_ENV
++ unset VIRTUAL_ENV_PROMPT
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/apps/el-gen/.venv
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ PATH=/apps/el-gen/.venv/bin:/root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1='(.venv) '
++ PS1='(.venv) (.venv) '
++ export PS1
++ VIRTUAL_ENV_PROMPT='(.venv) '
++ export VIRTUAL_ENV_PROMPT
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
+ set -x
+ mkdir -p /data/metadata
+ '[' -f /data/jwt/jwtsecret ']'
+ mkdir -p /data/jwt
++ openssl rand -hex 32
++ tr -d '\n'
+ echo -n 0x4610504046ecfa9f1a0b28a3be8cf4fd9f98fd03bcfc4871fa0ca91b99762844
+ '[' -f /data/metadata/genesis.json ']'
++ cat /data/metadata/genesis.json
++ jq -r '.config.terminalTotalDifficulty | tostring'
+ terminalTotalDifficulty=0
+ sed -i 's/TERMINAL_TOTAL_DIFFICULTY:.*/TERMINAL_TOTAL_DIFFICULTY: 0/' /data/metadata/config.yaml
+ '[' false = true ']'

--------------------

Reading genesis validators root
Command returned with exit code '0' and the following output: 0xd61ea484febacfae5298d52a2b581f3e305a51f3112a9241b968dccf019f7b11

Reading cancun time from genesis
Command returned with exit code '0' and the following output: 0

Reading prague time from genesis
Command returned with exit code '0' and the following output: 40118820102

Adding service with name 'el-1-geth-lighthouse' and image 'ethereum/client-go:latest'
Service 'el-1-geth-lighthouse' added with service UUID 'bfbd3f0bd4b24d0e8a61e4fff1d68281'

Waiting for at most '15m0s' for service 'el-1-geth-lighthouse' to reach a certain state
Wait took 1 tries (19.058992ms in total). Assertion passed with following:
Request had response code '200' and body "{\"jsonrpc\":\"2.0\",\"id\":1,\"result\":{\"id\":\"b08a32a445504854d1dfae6b4a117cb44acff83c1ad606f43dd5c2ba8e5f734c\",\"name\":\"Geth/v1.14.6-unstable-27008408/linux-amd64/go1.22.4\",\"enode\":\"enode://cad3157d92528ebba7b85bda3e0248ee83eb22e53fa2a9cfd68f3139063db1748445d3fc4247fd258b6d2d5a6829720e48a21f7f42bac09949597283e70704e0@172.16.0.10:30303\",\"enr\":\"enr:-Ki4QGf9EAr6iEL-QSivLgXiLFcvYMH9fZlmhuz2A0GRg2gsVqQSatkiNDK0NT4RpKU2fmazWpqlSuXQPjx7IpSOAQyGAZAxqNwug2V0aMzLhEtboJ6FCVdEnQaCaWSCdjSCaXCErBAAColzZWNwMjU2azGhAsrTFX2SUo67p7hb2j4CSO6D6yLlP6Kpz9aPMTkGPbF0hHNuYXDAg3RjcIJ2X4N1ZHCCdl8\",\"ip\":\"172.16.0.10\",\"ports\":{\"discovery\":30303,\"listener\":30303},\"listenAddr\":\"[::]:30303\",\"protocols\":{\"eth\":{\"network\":3151908,\"difficulty\":1,\"genesis\":\"0x3d0a48a9a3b7eb9ae754baaeeb6e8b96c24307499bd9d97424e76e0fcd5fd113\",\"config\":{\"chainId\":3151908,\"homesteadBlock\":0,\"eip150Block\":0,\"eip155Block\":0,\"eip158Block\":0,\"byzantiumBlock\":0,\"constantinopleBlock\":0,\"petersburgBlock\":0,\"istanbulBlock\":0,\"berlinBlock\":0,\"londonBlock\":0,\"mergeNetsplitBlock\":0,\"shanghaiTime\":0,\"cancunTime\":0,\"pragueTime\":40118820102,\"terminalTotalDifficulty\":0,\"terminalTotalDifficultyPassed\":true},\"head\":\"0x3d0a48a9a3b7eb9ae754baaeeb6e8b96c24307499bd9d97424e76e0fcd5fd113\"},\"snap\":{}}}}\n", with extracted fields:
'extract.enode': "enode://cad3157d92528ebba7b85bda3e0248ee83eb22e53fa2a9cfd68f3139063db1748445d3fc4247fd258b6d2d5a6829720e48a21f7f42bac09949597283e70704e0@172.16.0.10:30303"
'extract.enr': "enr:-Ki4QGf9EAr6iEL-QSivLgXiLFcvYMH9fZlmhuz2A0GRg2gsVqQSatkiNDK0NT4RpKU2fmazWpqlSuXQPjx7IpSOAQyGAZAxqNwug2V0aMzLhEtboJ6FCVdEnQaCaWSCdjSCaXCErBAAColzZWNwMjU2azGhAsrTFX2SUo67p7hb2j4CSO6D6yLlP6Kpz9aPMTkGPbF0hHNuYXDAg3RjcIJ2X4N1ZHCCdl8"

Printing a message
Successfully added 1 EL participants

Printing a message
Launching CL network

Adding service with name 'cl-1-lighthouse-geth' and image 'ethpandaops/lighthouse:stable'
There was an error executing Starlark code
An error occurred executing instruction (number 25) at github.com/ethpandaops/ethereum-package/src/cl/lighthouse/lighthouse_launcher.star[157:38]:
  add_service(name="cl-1-lighthouse-geth", config=ServiceConfig(image="ethpandaops/lighthouse:stable", ports={"http": PortSpec(number=4000, transport_protocol="TCP", application_protocol="http"), "metrics": PortSpec(number=5054, transport_protocol="TCP", application_protocol="http"), "tcp-discovery": PortSpec(number=9000, transport_protocol="TCP", application_protocol=""), "udp-discovery": PortSpec(number=9000, transport_protocol="UDP", application_protocol="")}, public_ports={}, files={"/jwt": "jwt_file", "/network-configs": "el_cl_genesis_data"}, cmd=["lighthouse", "beacon_node", "--debug-level=info", "--datadir=/data/lighthouse/beacon-data", "--disable-enr-auto-update", "--enr-address=KURTOSIS_IP_ADDR_PLACEHOLDER", "--enr-udp-port=9000", "--enr-tcp-port=9000", "--listen-address=0.0.0.0", "--port=9000", "--http", "--http-address=0.0.0.0", "--http-port=4000", "--http-allow-sync-stalled", "--slots-per-restore-point=32", "--disable-packet-filter", "--execution-endpoints=http://{{kurtosis:659ae255a29a47c782cd3c3b2ccd1f52:ip_address.runtime_value}}:8551", "--jwt-secrets=/jwt/jwtsecret", "--suggested-fee-recipient=0x8943545177806ED17B9F23F0a21ee5948eCaa776", "--subscribe-all-subnets", "--metrics", "--metrics-address=0.0.0.0", "--metrics-allow-origin=*", "--metrics-port=5054", "--enable-private-discovery", "--testnet-dir=/network-configs"], env_vars={"RUST_BACKTRACE": "full"}, private_ip_address_placeholder="KURTOSIS_IP_ADDR_PLACEHOLDER", max_cpu=1000, min_cpu=50, max_memory=1024, min_memory=256, ready_conditions=ReadyCondition(recipe=GetHttpRequestRecipe(port_id="http", endpoint="/eth/v1/node/health"), field="code", assertion="IN", target_value=[200, 206], timeout="15m"), labels={"ethereum-package.client": "lighthouse", "ethereum-package.client-image": "ethpandaops-lighthouse_stable", "ethereum-package.client-type": "beacon", "ethereum-package.connected-client": "geth", "ethereum-package.sha256": ""}, tolerations=[], node_selectors={}))
  Caused by: Unexpected error occurred starting service 'cl-1-lighthouse-geth'
  Caused by: An error occurred waiting for all TCP and UDP ports to be open for service 'cl-1-lighthouse-geth' with private IP '172.16.0.11'; this is usually due to a misconfiguration in the service itself, so here are the logs:
  == SERVICE 'cl-1-lighthouse-geth' LOGS ===================================
  Jun 19 18:01:10.164 INFO Logging to file                         path: "/data/lighthouse/beacon-data/beacon/logs/beacon.log"
  Jun 19 18:01:10.173 INFO Lighthouse started                      version: Lighthouse/v5.2.0-f1d88ba
  Jun 19 18:01:10.174 INFO Configured for network                  name: custom (/network-configs)
  Jun 19 18:01:10.174 INFO Data directory initialised              datadir: /data/lighthouse/beacon-data
  Jun 19 18:01:10.174 WARN Discv5 packet filter is disabled
  Jun 19 18:01:10.174 WARN Ignoring --http-allow-sync-stalled      info: this flag is deprecated and will be removed
  Jun 19 18:01:10.181 INFO Deposit contract                        address: 0x4242424242424242424242424242424242424242, deploy_block: 0
  Jun 19 18:01:10.193 INFO Blob DB initialized                     oldest_blob_slot: Some(Slot(0)), path: "/data/lighthouse/beacon-data/beacon/blobs_db", service: freezer_db
  Jun 19 18:01:10.218 INFO Starting from known genesis state       service: beacon

  == FINISHED SERVICE 'cl-1-lighthouse-geth' LOGS ===================================
  Caused by: An error occurred while waiting for all TCP and UDP ports to be open
  Caused by: Unsuccessful ports check for IP '172.16.0.11' and port spec '{privatePortSpec:0xc000af00f0}', even after '240' retries with '500' milliseconds in between retries. Timeout '2m0s' has been reached
  Caused by: An error occurred while calling network address '172.16.0.11:9000' with port protocol 'TCP' and using time out '200ms'
  Caused by: dial tcp 172.16.0.11:9000: connect: no route to host

Error encountered running Starlark code.

⭐ us on GitHub - https://github.com/kurtosis-tech/kurtosis
INFO[2024-06-19T14:03:10-04:00] ===================================================
INFO[2024-06-19T14:03:10-04:00] ||          Created enclave: aged-fjord          ||
INFO[2024-06-19T14:03:10-04:00] ===================================================
Name:            aged-fjord
UUID:            12d4a4d65e81
Status:          RUNNING
Creation Time:   Wed, 19 Jun 2024 14:00:41 EDT
Flags:

========================================= Files Artifacts =========================================
UUID           Name
41e0b44392ae   1-lighthouse-geth-0-63-0
04fe74888bf0   el_cl_genesis_data
9e6096891a99   final-genesis-timestamp
263f5cdc876b   genesis-el-cl-env-file
9c8e2b4fc7b6   genesis_validators_root
57c0b55a8fa1   jwt_file
cfae53fe5b59   keymanager_file
3e7619964741   prysm-password

========================================== User Services ==========================================
UUID           Name                                             Ports                                         Status
bfbd3f0bd4b2   el-1-geth-lighthouse                             engine-rpc: 8551/tcp -> 127.0.0.1:32954       RUNNING
                                                                metrics: 9001/tcp -> 127.0.0.1:32953
                                                                rpc: 8545/tcp -> http://127.0.0.1:32956
                                                                tcp-discovery: 30303/tcp -> 127.0.0.1:32952
                                                                udp-discovery: 30303/udp -> 127.0.0.1:32800
                                                                ws: 8546/tcp -> 127.0.0.1:32955
2bf7b0bc2d4e   validator-key-generation-cl-validator-keystore   <none>                                        RUNNING

@tedim52
Contributor

tedim52 commented Jun 19, 2024

Can you provide the output of docker network inspect for the containers associated with the enclave (api-container and el-geth-lighthouse)?
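
For example, something like the following (Kurtosis names enclave networks with a kt- prefix followed by the enclave name; the kt-<enclave-name> placeholder below is illustrative):

docker network ls
docker network inspect kt-<enclave-name>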

@h4ck3rk3y
Collaborator

Also, can you try running this package (https://github.com/kurtosis-tech/postgres-package/blob/main/main.star#L122) and see if you get something similar? It seems like there is something wrong with Docker x Kurtosis networking, and I think your APIC (the Kurtosis API container) can't reach the container.

If the postgres-package doesn't work, can you set wait=None on that line and run it using kurtosis run .
See https://docs.kurtosis.com/api-reference/starlark-reference/port-spec/ for docs on wait.
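
Roughly, the two runs would be (the second command assumes a local clone of the postgres-package with main.star edited to set wait=None on that PortSpec):

kurtosis run github.com/kurtosis-tech/postgres-package
kurtosis run .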

If that works then I'd try

docker exec -it core_container sh
ping postgres

@snail-inO
Author

Also, can you try running this package (https://github.com/kurtosis-tech/postgres-package/blob/main/main.star#L122) and see if you get something similar? It seems like there is something wrong with Docker x Kurtosis networking, and I think your APIC (the Kurtosis API container) can't reach the container.

If the postgres-package doesn't work, can you set wait=None on that line and run it using kurtosis run . See https://docs.kurtosis.com/api-reference/starlark-reference/port-spec/ for docs on wait.

If that works then I'd try

docker exec -it core_container sh
ping postgres

Thanks for the advice.

I can successfully run this package; here are the logs:

INFO[2024-06-19T23:02:37-04:00] Creating a new enclave for Starlark to run inside...
INFO[2024-06-19T23:02:39-04:00] Enclave 'dusty-fjord' created successfully

Container images used in this run:
> postgres:alpine - locally cached

Adding service with name 'postgres' and image 'postgres:alpine'
Service 'postgres' added with service UUID '494e1a842d9e4e6cb75215abcf0218a7'

Starlark code successfully run. Output was:
{
	"database": "postgres",
	"max_cpu": 1000,
	"max_memory": 1024,
	"min_cpu": 10,
	"min_memory": 32,
	"node_selectors": {},
	"password": "MyPassword1!",
	"port": {
		"application_protocol": "postgresql",
		"number": 5432,
		"transport_protocol": "TCP",
		"url": "postgresql://postgres:5432",
		"wait": "2m0s"
	},
	"service": {
		"hostname": "postgres",
		"ip_address": "172.16.4.5",
		"name": "postgres",
		"ports": {
			"postgresql": {
				"application_protocol": "postgresql",
				"number": 5432,
				"transport_protocol": "TCP",
				"url": "postgresql://postgres:5432",
				"wait": "2m0s"
			}
		}
	},
	"url": "postgresql://postgres:MyPassword1!@postgres/postgres",
	"user": "postgres"
}

⭐ us on GitHub - https://github.com/kurtosis-tech/kurtosis
INFO[2024-06-19T23:02:45-04:00] ====================================================
INFO[2024-06-19T23:02:45-04:00] ||          Created enclave: dusty-fjord          ||
INFO[2024-06-19T23:02:45-04:00] ====================================================
Name:            dusty-fjord
UUID:            0a52f2ecf98b
Status:          RUNNING
Creation Time:   Wed, 19 Jun 2024 23:02:37 EDT
Flags:

========================================= Files Artifacts =========================================
UUID   Name

========================================== User Services ==========================================
UUID           Name       Ports                                                  Status
494e1a842d9e   postgres   postgresql: 5432/tcp -> postgresql://127.0.0.1:32963   RUNNING

@snail-inO
Author

Can you provide the output of docker network inspect for the containers associated with the enclave (api-container and el-geth-lighthouse)?

Sure, here is the info:

Network kt-sparse-plateau

This network contains both el-1-geth-lighthouse and kurtosis-api.

[
    {
        "Name": "kt-sparse-plateau",
        "Id": "dd4fc8f5206179e931a37f136ff035d0e67c2d1ea486f78649f53408810a1853",
        "Created": "2024-06-19T23:06:33.939875495-04:00",
        "Scope": "local",
        "Driver": "bridge",
        "EnableIPv6": false,
        "IPAM": {
            "Driver": "default",
            "Options": null,
            "Config": [
                {
                    "Subnet": "172.16.0.0/22",
                    "Gateway": "172.16.0.19"
                }
            ]
        },
        "Internal": false,
        "Attachable": true,
        "Ingress": false,
        "ConfigFrom": {
            "Network": ""
        },
        "ConfigOnly": false,
        "Containers": {
            "079cdbb4dd032d724ea701ab266305edc0caaca9981acd23fcd676083a33d758": {
                "Name": "kurtosis-api--1287cfb89ccb434190e18a1e91e084cc",
                "EndpointID": "afc6fa8a054ac74c28130905ba60b791ac1f1e4a1304d6c5e5d9fe3054e32d8f",
                "MacAddress": "02:42:ac:10:00:03",
                "IPv4Address": "172.16.0.3/22",
                "IPv6Address": ""
            },
            "ab6994396641bcc313e3378b28857ad75cb4445942adfebc41aad26d1c2805c4": {
                "Name": "validator-key-generation-cl-validator-keystore--808a282644b94183bf288e87b7cae7ff",
                "EndpointID": "2832c135bfafa4f0b1844486a74c2d99b82d15a6b837719246bf39371244bfed",
                "MacAddress": "02:42:ac:10:00:04",
                "IPv4Address": "172.16.0.4/22",
                "IPv6Address": ""
            },
            "ef22c830a3f57a42c5f52814b83703b561f70d679bdbebe8f4ead86b1b37f741": {
                "Name": "kurtosis-reverse-proxy--c2cec3b1bb904b0ab4e6c0e94a6945a9",
                "EndpointID": "86b02e714a047768ef8749fd5c3e5f22525c6fd6b8fb507c0f4ed1c51d4ab593",
                "MacAddress": "02:42:ac:10:00:01",
                "IPv4Address": "172.16.0.1/22",
                "IPv6Address": ""
            },
            "f722c8686c673de7c66f1187ce951ed4484bb6f7cbca165990536a43603eb5b9": {
                "Name": "kurtosis-logs-collector--1287cfb89ccb434190e18a1e91e084cc",
                "EndpointID": "f2a65a2f4a9c4cc0cafa8e6ecb3adb9bb3e776046a0620521a14e7bff880d311",
                "MacAddress": "02:42:ac:10:00:02",
                "IPv4Address": "172.16.0.2/22",
                "IPv6Address": ""
            },
            "fd2ae1a69a863c91c54efd6840171f961258dfee376f1734bb0af498ce00dda9": {
                "Name": "el-1-geth-lighthouse--8d5ac9c9f70c48a1af7a5930d8866ff2",
                "EndpointID": "3d2c1286630a9f10babad1513e946233b5e2a88ab2235f30e214ebae1c64145c",
                "MacAddress": "02:42:ac:10:00:0a",
                "IPv4Address": "172.16.0.10/22",
                "IPv6Address": ""
            }
        },
        "Options": {
            "com.docker.network.driver.mtu": "1440"
        },
        "Labels": {
            "com.kurtosistech.app-id": "kurtosis",
            "com.kurtosistech.enclave-creation-time": "2024-06-20T03:06:33Z",
            "com.kurtosistech.enclave-id": "1287cfb89ccb434190e18a1e91e084cc",
            "com.kurtosistech.enclave-name": "sparse-plateau",
            "com.kurtosistech.guid": "1287cfb89ccb434190e18a1e91e084cc",
            "com.kurtosistech.id": "1287cfb89ccb434190e18a1e91e084cc",
            "enclave_uuid": "1287cfb89ccb434190e18a1e91e084cc",
            "service_name": "1287cfb89ccb434190e18a1e91e084cc",
            "service_short_uuid": "1287cfb89ccb",
            "service_uuid": "1287cfb89ccb434190e18a1e91e084cc"
        }
    }
]

Network bridge

This network contains kurtosis-api.

[
    {
        "Name": "bridge",
        "Id": "eef93f41c602a1be2d723d801c1ec4cc5a91e3c98649fdf441d0e6d0af4e4eb1",
        "Created": "2024-06-13T21:03:41.302490691-04:00",
        "Scope": "local",
        "Driver": "bridge",
        "EnableIPv6": false,
        "IPAM": {
            "Driver": "default",
            "Options": null,
            "Config": [
                {
                    "Subnet": "172.17.0.0/16",
                    "Gateway": "172.17.0.1"
                }
            ]
        },
        "Internal": false,
        "Attachable": false,
        "Ingress": false,
        "ConfigFrom": {
            "Network": ""
        },
        "ConfigOnly": false,
        "Containers": {
            "079cdbb4dd032d724ea701ab266305edc0caaca9981acd23fcd676083a33d758": {
                "Name": "kurtosis-api--1287cfb89ccb434190e18a1e91e084cc",
                "EndpointID": "7f2eaa3cd20bc958b21c801073679fdb84d5ba8630460dee29b2a27222f39393",
                "MacAddress": "02:42:ac:11:00:06",
                "IPv4Address": "172.17.0.6/16",
                "IPv6Address": ""
            },
            "606ee07dd368ab4f65ae969d31feaa42e3378b53b6ba2a293510ec1106d5dcd2": {
                "Name": "kurtosis-logs-aggregator",
                "EndpointID": "f6a0484f12161c79dda1c92e1997a8c985690bbf3601c4bca03f4756c39d1703",
                "MacAddress": "02:42:ac:11:00:02",
                "IPv4Address": "172.17.0.2/16",
                "IPv6Address": ""
            },
            "ef22c830a3f57a42c5f52814b83703b561f70d679bdbebe8f4ead86b1b37f741": {
                "Name": "kurtosis-reverse-proxy--c2cec3b1bb904b0ab4e6c0e94a6945a9",
                "EndpointID": "60c646e2c2b16ddbdee905703025951e48718a6a5715b9d0e260c75afc8d7bcc",
                "MacAddress": "02:42:ac:11:00:03",
                "IPv4Address": "172.17.0.3/16",
                "IPv6Address": ""
            },
            "f722c8686c673de7c66f1187ce951ed4484bb6f7cbca165990536a43603eb5b9": {
                "Name": "kurtosis-logs-collector--1287cfb89ccb434190e18a1e91e084cc",
                "EndpointID": "02ccb397ac61f76c00840beee0cc525c9d87ae7f3db004330f53ec3e687a2e62",
                "MacAddress": "02:42:ac:11:00:05",
                "IPv4Address": "172.17.0.5/16",
                "IPv6Address": ""
            },
            "f78220ac740656b77f151be1fb5e0b7deaff09d476a66b1eff7ab68e6d1cf576": {
                "Name": "kurtosis-engine--c2cec3b1bb904b0ab4e6c0e94a6945a9",
                "EndpointID": "360d853ecdd995efdd34fd2bf9c7c394071bc1c6099f61a02468aa96bc200b8b",
                "MacAddress": "02:42:ac:11:00:04",
                "IPv4Address": "172.17.0.4/16",
                "IPv6Address": ""
            }
        },
        "Options": {
            "com.docker.network.bridge.default_bridge": "true",
            "com.docker.network.bridge.enable_icc": "true",
            "com.docker.network.bridge.enable_ip_masquerade": "true",
            "com.docker.network.bridge.host_binding_ipv4": "0.0.0.0",
            "com.docker.network.bridge.name": "docker0",
            "com.docker.network.driver.mtu": "1500"
        },
        "Labels": {}
    }
]
