A collection of composable JSON processing commands for the yupsh ecosystem, providing jq-like capabilities with the power and type safety of Go.
The JSON command system consists of:
- JSON Framework (json/command.go) - Core JSON processing infrastructure
- Individual Commands - Independent modules that implement specific JSON operations
- Streaming JSON Support - Process newline-delimited JSON (NDJSON/JSONL)
- Array Processing - Iterate over array elements
- Type-Safe - Leverages Go's type system for compile-time safety
- Composable - Commands can be piped and combined
- Context-Based - Similar to awk's context pattern for structured processing
The framework provides three processing modes:
- StreamMode (default) - Process newline-delimited JSON
  - Each line is a separate JSON object
  - Ideal for log files and data streams
- ArrayMode - Process JSON array elements
  - Input is a single JSON array
  - Each element is processed individually
- SingleMode - Process a single JSON value
  - Input is one JSON object/value
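The stream-mode idea can be sketched with the standard library alone. This is illustrative, not the framework's implementation; the helper name `decodeLines` is made up for this example:

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"strings"
)

// decodeLines parses newline-delimited JSON: each non-empty line is
// decoded as an independent object, which is what StreamMode implies.
// Malformed lines are skipped rather than aborting the stream.
func decodeLines(input string) []map[string]any {
	var out []map[string]any
	scanner := bufio.NewScanner(strings.NewReader(input))
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		if line == "" {
			continue
		}
		var obj map[string]any
		if err := json.Unmarshal([]byte(line), &obj); err != nil {
			continue
		}
		out = append(out, obj)
	}
	return out
}

func main() {
	for _, o := range decodeLines("{\"name\":\"Alice\"}\n{\"name\":\"Bob\"}") {
		fmt.Println(o["name"])
	}
}
```

ArrayMode differs only in that a single `json.Unmarshal` into `[]any` yields the elements to iterate over.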
The Context object provides access to the current JSON value and utilities:
type Context struct {
    Value     any            // Current JSON value
    Index     int            // Current index (arrays) or -1
    Key       string         // Current key (objects) or empty
    Variables map[string]any // User-defined variables
    IsArray   bool           // Processing an array element
    IsObject  bool           // Processing an object
}

Key methods:
- AsMap() - Convert the value to map[string]any
- AsArray() - Convert the value to []any
- Get(path) - Extract the value at a dot-notation path
- Var(name) / SetVar(name, value) - Read and write user-defined variables
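The dot-notation lookup behind a method like Get can be sketched in a few lines. The `get` helper below is an illustration of the idea, not the Context implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// get walks a dot-notation path (e.g. "user.profile.email") through
// nested map[string]any values, returning nil when any segment is
// missing or the current value is not an object.
func get(v any, path string) any {
	for _, key := range strings.Split(path, ".") {
		m, ok := v.(map[string]any)
		if !ok {
			return nil
		}
		v = m[key]
	}
	return v
}

func main() {
	doc := map[string]any{
		"user": map[string]any{
			"profile": map[string]any{"email": "alice@example.com"},
		},
	}
	fmt.Println(get(doc, "user.profile.email")) // alice@example.com
}
```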
Implement custom JSON processors:
type Processor interface {
    Begin(ctx *Context) error         // Called once before processing
    Process(ctx *Context) (any, bool) // Called for each JSON value
    End(ctx *Context) (any, error)    // Called once after processing
}

These commands convert various formats to JSON so that they can be processed by the JSON commands:
Convert CSV (comma-separated values) to JSON.
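Conceptually, the conversion pairs each data row with the header row and emits one JSON object per record. A self-contained stdlib sketch (the `csvToJSON` helper is illustrative, not the fromcsv API):

```go
package main

import (
	"encoding/csv"
	"encoding/json"
	"fmt"
	"strings"
)

// csvToJSON turns CSV with a header row into newline-delimited JSON,
// one object per data row. All values stay strings, matching the
// example output below.
func csvToJSON(input string) ([]string, error) {
	records, err := csv.NewReader(strings.NewReader(input)).ReadAll()
	if err != nil || len(records) == 0 {
		return nil, err
	}
	header := records[0]
	var lines []string
	for _, row := range records[1:] {
		obj := map[string]string{}
		for i, field := range row {
			obj[header[i]] = field
		}
		b, _ := json.Marshal(obj) // map keys marshal in sorted order
		lines = append(lines, string(b))
	}
	return lines, nil
}

func main() {
	lines, _ := csvToJSON("name,age\nAlice,30\nBob,25")
	for _, l := range lines {
		fmt.Println(l)
	}
}
```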
// With headers (default)
cmd := fromcsv.FromCsv()
// Without headers (generates col1, col2, etc.)
cmd := fromcsv.FromCsv(fromcsv.WithoutHeader)
// Custom delimiter
cmd := fromcsv.FromCsv(fromcsv.Delimiter('|'))

Input CSV:
name,age,city
Alice,30,NYC
Bob,25,LA

Output JSON (newline-delimited):
{"age":"30","city":"NYC","name":"Alice"}
{"age":"25","city":"LA","name":"Bob"}

Convert TSV (tab-separated values) to JSON.
cmd := fromtsv.FromTsv()
cmd := fromtsv.FromTsv(fromtsv.WithoutHeader)

Convert TOML to JSON.
cmd := fromtoml.FromToml()

Input TOML:
title = "Example"
[owner]
name = "Alice"
age = 30

Output JSON (formatted):
{
  "title": "Example",
  "owner": {
    "name": "Alice",
    "age": 30
  }
}

Convert YAML to JSON.
cmd := fromyaml.FromYaml()

Input YAML:
name: Alice
age: 30
skills:
  - Go
  - Python

Output JSON:
{
  "name": "Alice",
  "age": 30,
  "skills": ["Go", "Python"]
}

Extract specific fields from JSON objects.
// Extract name and age fields
cmd := pluck.Pluck("name", "age")

Input:
{"name":"Alice","age":30,"city":"NYC"}
{"name":"Bob","age":25,"city":"LA"}

Output:
{"age":30,"name":"Alice"}
{"age":25,"name":"Bob"}

Filter JSON values based on conditions.
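Filtering is a predicate applied per object, which is the idea behind conditions like FieldMatches. A hedged stdlib sketch with illustrative names:

```go
package main

import "fmt"

// filter keeps the objects for which keep returns true.
func filter(objs []map[string]any, keep func(map[string]any) bool) []map[string]any {
	var out []map[string]any
	for _, o := range objs {
		if keep(o) {
			out = append(out, o)
		}
	}
	return out
}

func main() {
	people := []map[string]any{
		{"name": "Alice", "age": 30.0}, // JSON numbers decode as float64
		{"name": "Bob", "age": 25.0},
	}
	adults := filter(people, func(o map[string]any) bool {
		age, ok := o["age"].(float64)
		return ok && age > 28
	})
	fmt.Println(len(adults), adults[0]["name"]) // 1 Alice
}
```

Note the checked type assertion: asserting `.(float64)` without the `, ok` form would panic on objects where the field is missing or non-numeric.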
// Select objects with age > 28
cmd := selectcmd.Select(selectcmd.FieldMatches("age", func(val any) bool {
    return val.(float64) > 28
}))
// Select active users
cmd := selectcmd.Select(selectcmd.FieldEquals("status", "active"))
// Combine conditions
cmd := selectcmd.Select(selectcmd.And(
    selectcmd.HasField("age"),
    selectcmd.FieldEquals("status", "active"),
))

Navigate and extract values at JSON paths.
cmd := path.Path("user.profile.email")

Extract object keys.
cmd := keys.Keys()
// {"name":"Alice","age":30} -> ["age","name"]

Extract object values.
cmd := values.Values()
// {"name":"Alice","age":30} -> ["Alice",30]

Transform each element in an array.
cmd := mapjson.Map(func(ctx *json.Context) any {
    obj := ctx.Value.(map[string]any)
    obj["age"] = obj["age"].(float64) + 1
    return obj
})

Flatten nested arrays.
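Flattening one level of nesting, as in the example below, can be sketched like this (the helper name is illustrative; whether flattening recurses into deeper nesting is a design choice this sketch skips):

```go
package main

import "fmt"

// flattenOnce splices one level of nested slices into the result;
// non-slice elements are kept as-is.
func flattenOnce(in []any) []any {
	var out []any
	for _, v := range in {
		if inner, ok := v.([]any); ok {
			out = append(out, inner...)
		} else {
			out = append(out, v)
		}
	}
	return out
}

func main() {
	nested := []any{[]any{1, 2}, []any{3, 4}}
	fmt.Println(flattenOnce(nested)) // [1 2 3 4]
}
```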
cmd := flatten.Flatten()
// [[1,2],[3,4]] -> [1,2,3,4]

Merge multiple JSON objects.
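A shallow merge is a nested loop over the inputs; this sketch lets later objects win on duplicate keys (deep-merging nested objects is a separate design decision):

```go
package main

import "fmt"

// merge shallow-merges objects left to right; on duplicate keys the
// later object's value overwrites the earlier one.
func merge(objs ...map[string]any) map[string]any {
	out := map[string]any{}
	for _, o := range objs {
		for k, v := range o {
			out[k] = v
		}
	}
	return out
}

func main() {
	m := merge(map[string]any{"a": 1}, map[string]any{"b": 2})
	fmt.Println(m["a"], m["b"]) // 1 2
}
```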
cmd := merge.Merge()
// {"a":1} {"b":2} -> {"a":1,"b":2}

Restructure JSON with field mappings.
cmd := reshape.Reshape(map[string]string{
    "firstName": "name.first",
    "lastName":  "name.last",
})

Group objects by field values.
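Grouping buckets objects under the value of one field. A stdlib sketch with an illustrative helper name (fmt.Sprint gives non-string field values a usable map key):

```go
package main

import "fmt"

// groupBy buckets objects by the string form of a field's value.
func groupBy(objs []map[string]any, field string) map[string][]map[string]any {
	groups := map[string][]map[string]any{}
	for _, o := range objs {
		key := fmt.Sprint(o[field])
		groups[key] = append(groups[key], o)
	}
	return groups
}

func main() {
	users := []map[string]any{
		{"name": "Alice", "status": "active"},
		{"name": "Bob", "status": "inactive"},
		{"name": "Carol", "status": "active"},
	}
	fmt.Println(len(groupBy(users, "status")["active"])) // 2
}
```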
cmd := group.GroupBy("status")

Perform aggregation operations.
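Sum, count, and average reduce a numeric field across objects. Since JSON numbers decode to float64 in Go, this sketch (helper names illustrative) skips non-float64 values instead of panicking:

```go
package main

import "fmt"

// sum totals a numeric field and reports how many objects carried it.
func sum(objs []map[string]any, field string) (total float64, n int) {
	for _, o := range objs {
		if v, ok := o[field].(float64); ok {
			total += v
			n++
		}
	}
	return total, n
}

// avg divides the total by the count, returning 0 for no matches.
func avg(objs []map[string]any, field string) float64 {
	total, n := sum(objs, field)
	if n == 0 {
		return 0
	}
	return total / float64(n)
}

func main() {
	orders := []map[string]any{{"amount": 10.0}, {"amount": 20.0}}
	total, _ := sum(orders, "amount")
	fmt.Println(total, avg(orders, "amount")) // 30 15
}
```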
cmd := aggregate.Sum("amount")
cmd := aggregate.Count()
cmd := aggregate.Avg("score")

Get unique values/objects.
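Deduplicating by a field means keeping the first object seen per value, as a UniqueBy would. A sketch with an illustrative helper name (deduplicating whole objects instead would need a canonical key, e.g. their marshaled JSON):

```go
package main

import "fmt"

// uniqueBy keeps the first object seen for each value of a field,
// preserving input order.
func uniqueBy(objs []map[string]any, field string) []map[string]any {
	seen := map[string]bool{}
	var out []map[string]any
	for _, o := range objs {
		key := fmt.Sprint(o[field])
		if seen[key] {
			continue
		}
		seen[key] = true
		out = append(out, o)
	}
	return out
}

func main() {
	rows := []map[string]any{{"id": 1.0}, {"id": 1.0}, {"id": 2.0}}
	fmt.Println(len(uniqueBy(rows, "id"))) // 2
}
```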
cmd := unique.Unique()
cmd := unique.UniqueBy("id")

Sort arrays or object keys.
cmd := sortjson.SortBy("age")
cmd := sortjson.SortKeys()

Reverse array order.
cmd := reverse.Reverse()

Remove null/empty values.
cmd := compact.Compact()

Output the type of JSON values.
cmd := typejson.Type()
// "hello" -> "string"
// 123 -> "number"
// {"a":1} -> "object"

package main
import (
    "github.com/yupsh/json/pluck"
    gloo "github.com/yupsh/framework"
)
func main() {
    // Extract name and email fields
    cmd := pluck.Pluck("name", "email")
    gloo.MustRun(cmd)
}

// Read JSON from file
cmd := pluck.Pluck("name", "age")
inputs := yup.Initialize[yup.File, flags]("users.jsonl")
defer inputs.Close()

// Convert CSV to JSON, then extract fields
csvData := fromcsv.FromCsv()
// When pipe is available:
// cmd := pipe.Pipe(
//     fromcsv.FromCsv(),
//     pluck.Pluck("name", "email"),
// )
// Convert YAML config to JSON
yamlConfig := fromyaml.FromYaml()
// Convert TSV data and filter
tsvData := fromtsv.FromTsv()
// Then process with JSON commands

When pipe support is added:
// Filter active users, then extract name and email
cmd := pipe.Pipe(
    selectcmd.Select(selectcmd.FieldEquals("status", "active")),
    pluck.Pluck("name", "email"),
)

You can create your own JSON processing commands using the framework:
package mycommand
import (
    json "github.com/yupsh/json"
    gloo "github.com/yupsh/framework"
)
type myProcessor struct {
    json.SimpleProcessor
}

func (p *myProcessor) Process(ctx *json.Context) (any, bool) {
    // Your custom logic here
    return ctx.Value, true
}

func MyCommand() gloo.Command {
    processor := &myProcessor{}
    return json.Json(processor)
}

While inspired by jq, yupsh JSON commands are:
- Type-safe - Compile-time checking instead of runtime errors
- Composable - Can be used with other yupsh commands
- Extensible - Easy to add custom processors
- Familiar - Uses Go syntax and idioms
Not intended to be a direct jq replacement, but rather to provide similar capabilities within the yupsh ecosystem.
- ✅ JSON Framework
- ✅ StreamMode, ArrayMode, SingleMode processing
- ✅ fromcsv - CSV to JSON
- ✅ fromtsv - TSV to JSON
- ✅ fromtoml - TOML to JSON
- ✅ fromyaml - YAML to JSON
- ✅ pluck - Extract specific fields
- ✅ select - Filter based on conditions
- ⏳ path - Extract at JSON paths
- ⏳ keys - Extract object keys
- ⏳ values - Extract object values
- ⏳ map - Transform array elements
- ⏳ flatten - Flatten arrays
- ⏳ merge - Merge objects
- ⏳ group - Group by field
- ⏳ aggregate - Sum, count, avg, etc.
- ⏳ sort - Sort arrays/keys
- ⏳ unique - Unique values
- ⏳ compact - Remove nulls
- ⏳ type - Output types
MIT License - See LICENSE file in each command directory.