
Dasel query crashes with go panic #392

Closed
I-Al-Istannen opened this issue Mar 2, 2024 · 8 comments · Fixed by #393
Labels: bug (Something isn't working)

@I-Al-Istannen

Describe the bug
dasel crashes with panic: reflect.Set: value of type []interface {} is not assignable to type []*dencoding.Map.
Maybe a dupe of #282.

Stacktrace
panic: reflect.Set: value of type []interface {} is not assignable to type []*dencoding.Map

goroutine 1 [running]:
reflect.Value.assignTo({0x7483e0?, 0xc0353721e0?, 0xc02880af88?}, {0x7b95e2, 0xb}, 0x7476a0, 0x0)
	reflect/value.go:3307 +0x289
reflect.Value.Set({0x7476a0?, 0xc028351f80?, 0x16?}, {0x7483e0?, 0xc0353721e0?, 0xc0285af860?})
	reflect/value.go:2260 +0xe6
github.com/tomwright/dasel/v2.Value.Append({{0xc0322d2340, 0xc028351f80, 0x16}, 0xc028706520, 0x0, 0xc0285af860})
	github.com/tomwright/dasel/v2/value.go:295 +0x285
github.com/tomwright/dasel/v2.glob..func3(0x760ce0?, 0x41679a?, {0xd5cfc0?, 0x6?, 0xa61cc0?})
	github.com/tomwright/dasel/v2/func_append.go:26 +0x2ae
github.com/tomwright/dasel/v2.BasicFunction.Run(...)
	github.com/tomwright/dasel/v2/func.go:144
github.com/tomwright/dasel/v2.(*Step).execute(0xc000313810)
	github.com/tomwright/dasel/v2/step.go:30 +0xc3
github.com/tomwright/dasel/v2.(*Context).Next(0xc03530de00)
	github.com/tomwright/dasel/v2/context.go:227 +0x1a5
github.com/tomwright/dasel/v2.(*Context).Run(0x7aa6e0?)
	github.com/tomwright/dasel/v2/context.go:195 +0x29
github.com/tomwright/dasel/v2.(*Context).subSelect(0xc0322b8d80, {0x7aa6e0?, 0xc0285af980?}, {0xc0322e65e0?, 0x2?})
	github.com/tomwright/dasel/v2/context.go:162 +0x67
github.com/tomwright/dasel/v2.glob..func6.1({{0xc0322d2340, 0xc028351f80, 0x16}, 0xc028706380, 0x0, 0xc0285af860}, {{0xc0322e65e0, 0xc}, {0xc0322e65f0, 0xe}})
	github.com/tomwright/dasel/v2/func_equal.go:42 +0x85
github.com/tomwright/dasel/v2.glob..func6(0xc0322b8d80, 0xc0003135e0, {0xc032232da0?, 0x2, 0xa61cc0?})
	github.com/tomwright/dasel/v2/func_equal.go:64 +0x706
github.com/tomwright/dasel/v2.BasicFunction.Run(...)
	github.com/tomwright/dasel/v2/func.go:144
github.com/tomwright/dasel/v2.(*Step).execute(0xc0003135e0)
	github.com/tomwright/dasel/v2/step.go:30 +0xc3
github.com/tomwright/dasel/v2.(*Context).Next(0xc0322b8d80)
	github.com/tomwright/dasel/v2/context.go:227 +0x1a5
github.com/tomwright/dasel/v2.(*Context).Run(0x7aa6e0?)
	github.com/tomwright/dasel/v2/context.go:195 +0x29
github.com/tomwright/dasel/v2.Select({0x7aa6e0?, 0xc0285af890?}, {0x7ffce70d5591?, 0xc000138840?})
	github.com/tomwright/dasel/v2/context.go:117 +0x2a
github.com/tomwright/dasel/v2/internal/command.runSelectCommand(0xc0000c1c28, 0x7b3e8f?)
	github.com/tomwright/dasel/v2/internal/command/select.go:99 +0xbc
github.com/tomwright/dasel/v2/internal/command.selectRunE(0xc00019ac00?, {0xc00035bd80, 0x1, 0x7ad020?})
	github.com/tomwright/dasel/v2/internal/command/select.go:83 +0x4f9
github.com/spf13/cobra.(*Command).execute(0xc000004300, {0xc0000160a0, 0x8, 0x8})
	github.com/spf13/cobra@v1.8.0/command.go:983 +0xabc
github.com/spf13/cobra.(*Command).ExecuteC(0xc000004300)
	github.com/spf13/cobra@v1.8.0/command.go:1115 +0x3ff
github.com/spf13/cobra.(*Command).Execute(0x408e2b?)
	github.com/spf13/cobra@v1.8.0/command.go:1039 +0x13
main.main()
	github.com/tomwright/dasel/v2/cmd/dasel/main.go:10 +0x25
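
For context, this class of panic comes straight from Go's reflect package: reflect.Value.Set panics whenever the source value's type is not assignable to the destination's type. A minimal standalone sketch of the same failure mode (using []int in place of []*dencoding.Map, purely for illustration):

```go
package main

import (
	"fmt"
	"reflect"
)

func main() {
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("recovered:", r)
		}
	}()

	typed := []int{1, 2}              // a concrete slice type, like []*dencoding.Map
	generic := []interface{}{1, 2, 3} // like the []interface{} built during Append

	// Obtain a settable reflect.Value for the typed slice...
	v := reflect.ValueOf(&typed).Elem()
	// ...then try to Set it from a value of a different slice type.
	// This panics: []interface {} is not assignable to []int.
	v.Set(reflect.ValueOf(generic))
}
```

Running this recovers and prints the same "value of type []interface {} is not assignable to type ..." message seen in the stack trace above.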

To Reproduce
Steps to reproduce the behavior:

  1. Create the following CSV (or any other, I think):
    Hello;There;
    1;2;
    
  2. Run dasel -f foo.csv --csv-comma ';' -w json --pretty 'equal([], )'
  3. Observe the error

I have to admit I didn't look much at the dasel query syntax (this is a reduced version); I just ended up piping the output to jq and using some select and map magic for my use case :)

Expected behavior
It should not crash.

Desktop:

  • OS: Arch Linux (using the nix package, though)
  • Version: dasel version 2.6.0
@I-Al-Istannen I-Al-Istannen added the bug Something isn't working label Mar 2, 2024
@TomWright TomWright self-assigned this Mar 2, 2024
@TomWright
Owner

Hey @I-Al-Istannen,

Thanks for raising this. I'll do a bit of digging.

Out of interest, what were you actually trying to get from the CSV?

@TomWright
Owner

It seems this panic could occur when appending to a list of objects. I have fixed it here: #393. However, the fix requires the deletion of some code that I need to verify is OK to remove.
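
(This is not dasel's actual fix in #393, just a generic sketch of one way to avoid this class of panic: check assignability before calling Set, and fall back to element-wise conversion. The setSlice helper below is hypothetical.)

```go
package main

import (
	"fmt"
	"reflect"
)

// setSlice assigns src to dst (a pointer to a slice). When the slice
// types differ, it rebuilds the destination slice element by element
// instead of calling Set with an incompatible type.
func setSlice(dst, src interface{}) error {
	d := reflect.ValueOf(dst).Elem()
	s := reflect.ValueOf(src)
	if s.Type().AssignableTo(d.Type()) {
		d.Set(s) // fast path: types already match
		return nil
	}
	out := reflect.MakeSlice(d.Type(), 0, s.Len())
	for i := 0; i < s.Len(); i++ {
		el := s.Index(i)
		if el.Kind() == reflect.Interface {
			el = el.Elem() // unwrap interface{} to the concrete value
		}
		if !el.Type().AssignableTo(d.Type().Elem()) {
			return fmt.Errorf("element %d (%s) not assignable to %s",
				i, el.Type(), d.Type().Elem())
		}
		out = reflect.Append(out, el)
	}
	d.Set(out)
	return nil
}

func main() {
	var ints []int
	if err := setSlice(&ints, []interface{}{1, 2, 3}); err != nil {
		panic(err)
	}
	fmt.Println(ints) // [1 2 3]
}
```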

@I-Al-Istannen
Author

Thanks for raising this. I'll do a bit of digging.

Thanks :)

Out of interest, what were you actually trying to get from the CSV?

I was parsing stop data for the German train network, which of course only offers data in a large CSV file :P The full command for finding stops I was interested in looked something like this:

dasel -r csv -f <(iconv -f ISO-8859-1 -t UTF-8 haltestellen.csv) --csv-comma ';' -w json | jq 'map(select(.Gemeinde | contains("Name"))) | map({Gemeinde, Haltestelle, Haltestelle_lang, globaleID})'

I was trying to do the select with dasel first, but it was sufficiently different from jq that I didn't manage to cook something up quickly, and then I ran into the crash. dasel+jq worked just fine. The above command takes around 7 seconds, with dasel taking up 5 of them, but I don't really care about the performance :D

@TomWright
Owner

Yeah, dasel unfortunately loads the entire file into memory because of the change in file type. Streaming parsers are hard enough on their own - imagine trying to stream readers + writers across 5 file types.
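
For what it's worth, CSV on its own does stream nicely in Go: encoding/csv reads one record at a time, so memory stays flat for a single format. The hard part described above is doing this generically across readers and writers for several formats at once. A minimal standalone sketch of per-record CSV reading (not dasel code):

```go
package main

import (
	"encoding/csv"
	"fmt"
	"io"
	"strings"
)

func main() {
	// Inline sample standing in for a large semicolon-separated file.
	in := "Hello;There\n1;2\n"
	r := csv.NewReader(strings.NewReader(in))
	r.Comma = ';'

	// Read one record at a time; only the current row is held in memory.
	for {
		rec, err := r.Read()
		if err == io.EOF {
			break
		}
		if err != nil {
			panic(err)
		}
		fmt.Println(rec)
	}
}
```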

Performance aside, I'd be interested in a subset of your file that I can create some tests against to:
A: be sure no panic will occur
B: potentially help you with your query without the need for jq (I'm always looking for ways to increase the general usefulness of dasel)

@I-Al-Istannen
Author

Streaming parsers are hard enough on their own - imagine trying to stream readers + writers across 5 file types.

:^)


You can find the full file for e.g. Baden-Württemberg here. They offer data in the only two capable data formats this world has come up with, semicolon-separated comma-separated values and XLSX. You can follow the "csv Datei" link (or click here directly).

The query is in my last comment; it just takes the JSON array, filters the entries down to the ones in a specific municipality, and then trims the JSON objects down to the relevant fields.

@TomWright
Owner

Hey @I-Al-Istannen,

Just for your information, I got some time to dig into this. The following is possible without hitting the panic you experienced:

./dasel -r csv -f <(iconv -f ISO-8859-1 -t UTF-8 ~/Downloads/file.csv) --csv-comma ';' -w json 'all().filterOr(equal(Gemeinde,Stuttgart),equal(Gemeinde,Mahlstetten)).mapOf(Gemeinde,Gemeinde,Haltestelle,Haltestelle,Haltestelle_lang,Haltestelle_lang,globaleID,globaleID)'
6.79s user 0.10s system 164% cpu 4.185 total

This shows just the fields:

  • Gemeinde
  • Haltestelle
  • Haltestelle_lang
  • globaleID

For any stations in:

  • Stuttgart
  • Mahlstetten

I have also merged the panic fix to master.

@I-Al-Istannen
Author

I-Al-Istannen commented Mar 14, 2024

Ah, I see. You also need to duplicate all the fields in the map, since it is key,value,key,value,…. I was apparently trying more of a jq-like approach for the filter, 'equal([].Landkreis, "Sigmaringen")', but that already maps the array values, so you do not need to deal with the outer array any further here.

@TomWright
Owner

That's right - I want a neater version of that map function for exactly this purpose. It works for now, though.
