Hello, I'm hitting a segmentation fault (SIGSEGV) when running a simple plan/apply use case for testing purposes.
Here is an extract of the logs:

```
package digger
default allow = true
allow = (count(input.planPolicyViolations) == 0)
Running 'digger plan' for project 'layers_group-4_layer-1' (workflow: )
Error while fetching user teams for CI service: failed to list github teams: GET https://api.github.com/orgs/padok-team/teams: 403 Resource not accessible by integration []
WARNING: teams failed to be fetched, passing an empty list instead for access policy checks
DEBUG: passing the following input policy: map[action:digger plan approvals:[] organisation:padok-team planPolicyViolations:[] project:layers_group-4_layer-1 teams:[] user:cterence] ||| text:
package digger
default allow = true
allow = (count(input.planPolicyViolations) == 0)
Using authentication strategy: Terragrunt
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x19ff873]
goroutine 1 [running]:
github.com/diggerhq/digger/libs/scheduler.(*Job).AuthTerragrunt(0xc0009d47f0)
/home/runner/work/digger/digger/libs/scheduler/aws.go:188 +0x33
github.com/diggerhq/digger/libs/scheduler.(*Job).PopulateAwsCredentialsEnvVarsForJob(0xc0009d47f0)
/home/runner/work/digger/digger/libs/scheduler/aws.go:111 +0xa6
github.com/diggerhq/digger/cli/pkg/digger.run({_, _}, {{0xc000493920, 0x16}, {0xc000493938, 0x16}, {0xc000493950, 0x16}, {0x0, 0x0}, ...}, ...)
/home/runner/work/digger/digger/cli/pkg/digger/digger.go:201 +0x319
github.com/diggerhq/digger/cli/pkg/digger.RunJobs({0xc0009d5560, 0x1, 0x2cccc4a?}, {0x3574380, 0xc00085ab40}, {0x3527e60, 0xc00085ac00}, {0x355ada0, 0x4ec6420}, {0x355ef98, ...}, ...)
/home/runner/work/digger/digger/cli/pkg/digger/digger.go:96 +0x1478
github.com/diggerhq/digger/cli/pkg/spec.RunSpec({{0x0, 0x0}, {0xc0008586e0, 0x10}, {0xc0008586f0, 0xa}, {0x0, 0x0}, {{0xc0008586dc, 0x4}, ...}, ...}, ...)
/home/runner/work/digger/digger/cli/pkg/spec/spec.go:153 +0x186d
main.init.func1(0xc000115e00?, {0x2cc8279?, 0x4?, 0x2cc8229?})
/home/runner/work/digger/digger/cli/cmd/digger/default.go:35 +0x15a
github.com/spf13/cobra.(*Command).execute(0x4e274c0, {0x4ec6420, 0x0, 0x0})
/home/runner/go/pkg/mod/github.com/spf13/cobra@v1.8.1/command.go:989 +0xab1
github.com/spf13/cobra.(*Command).ExecuteC(0x4e277a0)
/home/runner/go/pkg/mod/github.com/spf13/cobra@v1.8.1/command.go:1117 +0x3ff
github.com/spf13/cobra.(*Command).Execute(...)
/home/runner/go/pkg/mod/github.com/spf13/cobra@v1.8.1/command.go:1041
main.main()
/home/runner/work/digger/digger/cli/cmd/digger/main.go:74 +0xe7
```
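For what it's worth, this looks like Go's standard nil-pointer panic: per the trace, `PopulateAwsCredentialsEnvVarsForJob` calls `AuthTerragrunt`, which dereferences something at `aws.go:188` that is nil when no AWS configuration is present. A minimal sketch of that failure pattern, with hypothetical names that are not Digger's actual fields:

```go
package main

// AwsConfig is a hypothetical stand-in for whatever struct aws.go:188
// dereferences; presumably it is nil when no AWS credentials were configured.
type AwsConfig struct {
	RoleArn string
}

type Job struct {
	Aws *AwsConfig // left nil, as when setup-aws is false
}

// AuthTerragrunt mirrors the failing frame: reading through the nil field
// triggers SIGSEGV with "invalid memory address or nil pointer dereference",
// exactly the message in the trace above.
func (j *Job) AuthTerragrunt() string {
	return j.Aws.RoleArn // panics: j.Aws is nil
}

func main() {
	j := &Job{}
	_ = j.AuthTerragrunt()
}
```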
Here are the details:
- The repository is private
- The codebase is just some Terragrunt config files instantiating a module with random pets: no AWS resources, no provider config, no backend config (state is stored locally and doesn't matter)
My `digger.yml`:

```yaml
generate_projects:
  terragrunt_parsing:
    parallel: true
    createProjectName: true
    createWorkspace: true
    defaultWorkflow: default
workflows:
  default:
    plan:
      steps:
        - init
        - plan
        - run: echo "Terragrunt generation!"
```
My `digger_workflow.yml`:

```yaml
name: Digger Workflow
on:
  workflow_dispatch:
    inputs:
      spec:
        required: true
      run_name:
        required: false
run-name: '${{inputs.run_name}}'
jobs:
  digger-job:
    runs-on: ubuntu-latest
    permissions:
      contents: write # required to merge PRs
      actions: write # required for plan persistence
      id-token: write # required for workload-identity-federation
      pull-requests: write # required to post PR comments
      issues: read # required to check if PR number is an issue or not
      statuses: write # required to validate combined PR status
    steps:
      - uses: actions/checkout@v4
      - name: ${{ fromJSON(github.event.inputs.spec).job_id }}
        run: echo "job id ${{ fromJSON(github.event.inputs.spec).job_id }}"
      - uses: diggerhq/digger@vLatest
        with:
          digger-spec: ${{ inputs.spec }}
          setup-terragrunt: true
          setup-aws: false
          terragrunt-version: 0.68.13
        env:
          GITHUB_CONTEXT: ${{ toJson(github) }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
Am I doing something wrong? I'm puzzled that even with `setup-aws: false`, the stack trace shows AWS auth code being run.
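My reading is that the credentials-population path runs unconditionally ("Using authentication strategy: Terragrunt" is logged even though nothing AWS-related is configured here), and the Terragrunt branch then assumes a non-nil AWS config. If that's right, a nil guard along these lines would avoid the panic (a sketch using the same hypothetical types as above, not the actual Digger source):

```go
package main

type AwsConfig struct{ RoleArn string }

type Job struct {
	Aws *AwsConfig // nil when setup-aws is false
}

func (j *Job) AuthTerragrunt() string { return j.Aws.RoleArn }

// Sketch of a defensive PopulateAwsCredentialsEnvVarsForJob: skip the
// Terragrunt auth path entirely when no AWS configuration is present,
// instead of dereferencing a nil field.
func (j *Job) PopulateAwsCredentialsEnvVarsForJob() {
	if j.Aws == nil {
		return // nothing to authenticate against
	}
	_ = j.AuthTerragrunt()
}

func main() {
	j := &Job{}
	j.PopulateAwsCredentialsEnvVarsForJob() // no panic: the guard short-circuits
}
```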