Iterate over result in bash #503
Comments
I'm not sure I understood what you want to do.
I want to iterate in bash over a list inside JSON. Full output of the first curl command: […]

I want to do the equivalent of this in bash: […]
You could do something like:

```shell
for key in $(curl -Ss http://myriak.server:8098/buckets/mybucket/index/index_int/100/200/ | jq -r '.keys[] | @uri'); do
  curl http://myriak/buckets/mybucket/keys/$key | jq '.[] | {.id}';
done
```

At least until some kind of […]
@wtlangford thanks! Problem with this approach: it includes the double quotes on both sides of each key, so you end up doing something like this inside the loop: […]

Do you have any advice (apart from the dirty hacks I currently use :) ) on how to solve this nicely? Thanks!
@wtlangford I mean, my current solution is to use […] inside the loop. This strips off the double quotes, so maybe that's OK for me.
@k-bx - The `-r` option that @wtlangford used strips off the unwanted outer double-quotation marks, so I think the following may be closer to what you are looking for if the keys have non-alphanumeric characters: […]
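As a runnable sketch of that `-r` plus `@uri` combination, with the curl output replaced by a made-up inline sample:

```shell
# Stand-in for the index response that the first curl would return.
keys_json='{"keys":["foo bar","baz/qux","plain"]}'

# -r emits raw strings (no surrounding double quotes), and @uri
# percent-encodes characters that are unsafe in a URL path segment.
printf '%s' "$keys_json" | jq -r '.keys[] | @uri'
# foo%20bar
# baz%2Fqux
# plain
```

The encoded values can then be interpolated directly into the per-key curl URL.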
@pkoppstein @wtlangford ah, I totally missed the […]!
@pkoppstein I definitely like the use of […]. Also, I just took the chance to look at […]
@wtlangford I think it should be […] — the intention was to select only the id field of the objects. P.S.: what would be even cooler is if you could tell me a good way to also loop through these id fields of the result objects.
The following variation only invokes jq twice: […]
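The snippet quoted in that comment did not survive the page scrape; the following is only a sketch of one way to keep it to two jq invocations, with printf stand-ins where the real pipeline would call curl against the Riak URLs from earlier in the thread:

```shell
# jq call 1: extract the keys from the index response as raw strings.
# (Real pipeline: curl -Ss http://myriak.server:8098/buckets/mybucket/index/index_int/100/200/)
printf '%s' '{"keys":["k1","k2"]}' | jq -r '.keys[]' |
while read -r key; do
  # Stand-in for: curl -Ss "http://myriak.server:8098/buckets/mybucket/keys/$key"
  printf '[{"id":"%s-id"}]\n' "$key"
done |
# jq call 2: one invocation consumes the whole concatenated stream of
# JSON texts produced by the loop.
jq -r '.[] | .id'
# k1-id
# k2-id
```

The point is that the second jq never restarts per key: it reads the loop's entire output as a stream.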
@pkoppstein just to confirm it'll work -- the second curl also returns a JSON list (not an object).
@k-bx jq will read as many JSON texts on stdin as you feed it, not just one. So the […]
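That streaming behaviour is easy to check without any server; here three concatenated JSON texts are fed to a single jq run:

```shell
# Three separate JSON texts on stdin, one jq process: the filter is
# applied to each text in turn.
printf '{"n":1}\n{"n":2}\n{"n":3}\n' | jq '.n'
# 1
# 2
# 3
```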
I've included a similar (but tested) script in the jq Cookbook -- see "emit-the-ids-of-json-objects-in-a-riak-database".
Thanks @pkoppstein. I've added a note to the FAQ inviting users to edit the wiki.
Fantastic! Thank you, all my problems with querying Riak are now solved. I just played with it a bit -- everything works great. This (and the fact that you can use Riak's streaming API) opens up so many opportunities; for instance, it's now easy to do things like "count by unique JSON field value". Thanks again!
@k-bx It's really cool that Riak uses sequences of JSON text for streaming. Thanks for pointing that out! And it's wonderful that jq just works with that. |
I'm trying to take every Key,Value pair of an output and pipe it to another command.

```shell
INSTANCE_ID=$(curl http://169.254.169.254/latest/meta-data/instance-id)
aws ec2 describe-tags --filters "Name=resource-id,Values=$INSTANCE_ID"
```

With the above command, I have the following output: […]

Now I want to pipe each Key,Value pair to the following command: […]

The number and values of the Key,Value pairs are variable, so I believe I need a "for each". Can you help me? It's like: for each Key,Value pair, do:

```shell
aws ec2 create-tags --resources XXXXX --tags Key=A-KEY,Value=A-VALUE
aws ec2 create-tags --resources XXXXX --tags Key=B-KEY,Value=B-VALUE
aws ec2 create-tags --resources XXXXX --tags Key=C-KEY,Value=C-VALUE
aws ec2 create-tags --resources XXXXX --tags Key=N...-KEY,Value=N...-VALUE
```
@thiagodolabella In the future, please send usage questions to StackOverflow's jq tag, and avoid reusing existing issues for unrelated questions. One way to do this would be to have […]
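The code in that reply was lost in the scrape; as a sketch of the general idea, jq can emit one `create-tags` command line per tag, which a shell can then execute. The `Tags` array below is a simplified stand-in for `aws ec2 describe-tags` output, and `OTHER_ID` is a hypothetical placeholder for the target resource:

```shell
# Simplified stand-in for:
#   aws ec2 describe-tags --filters "Name=resource-id,Values=$INSTANCE_ID"
tags_json='{"Tags":[{"Key":"Name","Value":"web"},{"Key":"Env","Value":"prod"}]}'

# jq string interpolation \(...) builds one command per array element.
printf '%s' "$tags_json" |
jq -r '.Tags[] | "aws ec2 create-tags --resources OTHER_ID --tags Key=\(.Key),Value=\(.Value)"'
# aws ec2 create-tags --resources OTHER_ID --tags Key=Name,Value=web
# aws ec2 create-tags --resources OTHER_ID --tags Key=Env,Value=prod
```

Piping the result into `sh` (or `xargs -L1`) would run the commands, assuming the tag values contain no shell metacharacters.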
Hi! So, I'm using Riak indexes, whose output looks something like this: […]

The obvious task is to iterate over this "keys" list later, in order to actually query for the values and filter them with jq. I just did this: https://gist.github.com/k-bx/019fa8429b3634acd0c0 , but it looks ugly (stripping out N symbols, splitting by comma, removing the double quotes from the keys). It would be really cool if jq were able to do that for me.

Thanks!
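For the record, the manual quote-stripping the gist performs is what jq itself provides via the `-r` flag; a minimal sketch on a made-up index response:

```shell
# -r prints each key as a raw string, with no surrounding quotes,
# so no post-processing of the output is needed.
printf '%s' '{"keys":["ABC123","DEF456"]}' | jq -r '.keys[]'
# ABC123
# DEF456
```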