Reads and writes JSON from and to various sources. Reading can be done from a string directly, or using one of the `Input` implementations:
```php
// Strings
$value= Json::read('"Test"');

// Input
$in= new FileInput(new File('input.json'));
$in= new StringInput('{"Hello": "World"}');
$in= new StreamInput(new SocketInputStream(...));
$value= Json::read($in);
```
Writing can be done to a string or using one of the `Output` implementations:

```php
// Strings
$json= Json::of('Test');

// Output
$out= new FileOutput(new File('output.json'));
$out= new StreamOutput(new SocketOutputStream(...));
Json::write($value, $out);
```
To change the output format, pass a `Format` instance to the output's constructor. The formats available are:

* `DenseFormat($options)` - best for network I/O, no insignificant whitespace. The default if nothing is given, accessible via `Format::dense($options= ~Format::ESCAPE_SLASHES)`.
* `WrappedFormat($indent, $options)` - wraps first-level arrays and all objects, uses whitespace after commas and colons. An instance of this format using 4 spaces for indentation and by default leaving forward slashes unescaped is available via `Format::wrapped($indent= "    ", $options= ~Format::ESCAPE_SLASHES)`.
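The `Format::dense()` and `Format::wrapped()` factories are shorthand; given the constructor signatures listed above, instantiating the format classes directly should be equivalent (a sketch, assuming the constructors take the documented arguments):

```php
// Presumably equivalent to Format::dense(): no insignificant whitespace,
// forward slashes left unescaped via ~Format::ESCAPE_SLASHES
$dense= new DenseFormat(~Format::ESCAPE_SLASHES);

// Presumably equivalent to Format::wrapped(): 4 spaces of indentation,
// forward slashes left unescaped
$wrapped= new WrappedFormat('    ', ~Format::ESCAPE_SLASHES);
```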
The available options, which can be or'ed together, are:

* `Format::ESCAPE_SLASHES` - escape forward slashes with `"\/"` - default behavior.
* `Format::ESCAPE_UNICODE` - escape unicode with `"\uXXXX"` - default behavior.
* `Format::ESCAPE_ENTITIES` - escape the XML entities `&`, `"`, `<` and `>`. By default, these are represented in their literal form.
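Since these options are bit flags, they can be combined with `|`, or removed from the defaults with `~`. A short sketch, assuming `Format::dense($options)` accepts such a combined value as documented above:

```php
// Escape both forward slashes and XML entities
$format= Format::dense(Format::ESCAPE_SLASHES | Format::ESCAPE_ENTITIES);

// All defaults except unicode escaping
$format= Format::dense(~Format::ESCAPE_UNICODE);

// Pass the format to the output's constructor
$out= new FileOutput(new File('output.json'), $format);
```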
```php
$out= new FileOutput(new File('glue.json'), Format::wrapped());
$out->write([
  'name'    => 'example/package',
  'version' => '1.0.0',
  'require' => [
    'xp-forge/json'     => '^3.0',
    'xp-framework/core' => '^10.0'
  ]
]);
```
The above code will yield the following output:
```json
{
  "name": "example/package",
  "version": "1.0.0",
  "require": {
    "xp-forge/json": "^3.0",
    "xp-framework/core": "^10.0"
  }
}
```
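For comparison: omitting the `Format` instance uses the default dense format, which per the `DenseFormat` description above emits no insignificant whitespace. A sketch of the expected result:

```php
// No Format passed: the default dense format is used
$out= new FileOutput(new File('glue.json'));
$out->write(['name' => 'example/package', 'version' => '1.0.0']);

// glue.json should then contain everything on a single line, roughly:
// {"name":"example/package","version":"1.0.0"}
```

Note the forward slash in `example/package` stays unescaped, since the default options are `~Format::ESCAPE_SLASHES`.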
Processing elements sequentially can save memory and improve performance in certain situations. You can use the `elements()` method to receive an iterator over a JSON array. Instead of loading the entire source into memory and then returning the parsed array, it parses one array element at a time, yielding each element as it goes.
```php
$conn= new HttpConnection(...);
$in= new StreamInput($conn->get('/search?q=example&limit=1000')->in());
foreach ($in->elements() as $element) {
  // Process
}
```
If you get a huge object, you can also process it sequentially using the `pairs()` method, which parses a single key/value pair at a time.
```php
$conn= new HttpConnection(...);
$in= new StreamInput($conn->get('/resources/4711?expand=*')->in());
foreach ($in->pairs() as $key => $value) {
  // Process
}
```
To detect the type of the data on the stream (again, without reading it completely), you can use the `type()` method.
```php
$conn= new HttpConnection(...);
$in= new StreamInput($conn->get($resource)->in());

$type= $in->type();
if ($type->isArray()) {
  // Handle arrays
} else if ($type->isObject()) {
  // Handle objects
} else {
  // Handle primitives
}
```
To write data sequentially, you can use the `begin()` method and the stream it returns. This makes sense when the source offers a way to read data sequentially; if you already have the entire data in memory, using `write()` has the same effect.
```php
$query= $conn->query('select * from person');

$stream= (new StreamOutput(...))->begin(Types::$ARRAY);
while ($record= $query->next()) {
  $stream->element($record);
}
$stream->close();
```
As the `Stream` class implements the `Closeable` interface, it can be used in the `with` statement:
```php
$query= $conn->query('select * from person');

with ((new StreamOutput(...))->begin(Types::$ARRAY), function($stream) use($query) {
  while ($record= $query->next()) {
    $stream->element($record);
  }
});
```
- Performance figures. TL;DR: While slower than the native functionality, the performance overhead is in the low millisecond range. Sequential processing gives an advantage both performance- and memory-wise.
- Parsing JSON is a Minefield. This library runs this test suite next to its own.