The project uses data generated by the CERN Open Data offline system, which analyses physics objects. Instructions on how to run the examples in order to generate the analysis files can be found in the links above. To use the data, I generate a JSON file containing the analysis of the physics objects; the JSON file is then read in SuperCollider[^1], which sonifies the data.
The JSON file is read using the "api" quark.
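As a minimal sketch of this step (not the project's actual reading code), the JSON can also be parsed with SuperCollider's built-in `parseYAML`, since JSON is valid YAML. The file name and the field names (`events`, `muons`, `pt`) are placeholders; the real keys depend on the OpenData analysis output.

```supercollider
(
// Sketch: load an analysis file and parse it into SuperCollider collections.
var file, jsonString, data;
file = File("analysis.json", "r");       // assumed name of the offline analysis output
jsonString = file.readAllString;         // read the whole file as a string
file.close;
data = jsonString.parseYAML;             // JSON is valid YAML; values come back as Strings
data["events"].do { |event, i|
    var pts = event["muons"].collect { |m| m["pt"].asFloat };
    ("event %: muon pT = %".format(i, pts)).postln;
};
)
```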
The project is divided into two parts: an offline part, which uses data generated with the OpenData system, and an online part, which uses a GUI that represents the real-time data.
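The online part is not detailed here; purely as a sketch, assuming the real-time data reaches SuperCollider as OSC messages (the `/opendata/event` address is hypothetical), a window could display incoming values like this:

```supercollider
(
var win, label;
win = Window("Real-time events", Rect(100, 100, 320, 80));
label = StaticText(win, Rect(10, 10, 300, 60)).string_("waiting for data...");
win.onClose_({ OSCdef(\eventDisplay).free });   // clean up the responder with the window
win.front;

// Update the display whenever a message arrives; GUI calls must be deferred.
OSCdef(\eventDisplay, { |msg|
    { label.string_("latest value: %".format(msg[1])) }.defer;
}, '/opendata/event');
)
```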
Some ideas and considerations to be followed in order to obtain the streaming data: of great importance is the implementation of the mapping. Rather than selecting bits of the data (as happens in the piano piece), the strategy should focus on how the data is mapped, which makes the mapping process crucial.
- Model a stream of the data from an offline file (a sketch combining this with a filter and a mapping follows the list).
- Make a selection interface, ideally one that works with both a stream and a whole set of data.
- Explore mapping strategies for the data, testing with both manual selection and streaming input. The latter will probably require some kind of filter, as even generating micro-sound the result will likely be too dense.
- Explore selection strategies that use an event-filtering approach rather than manual selection. This is worth elaborating, particularly if machine-listening capabilities are adopted in the system.
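The following sketch illustrates the first and third points together: a stream modelled from an offline file, a simple event filter, and one mapping (muon pT to pitch). The field names (`events`, `muons`, `pt`), the two-muon filter, the pT-to-frequency range, and the fixed 0.25 s inter-event time are all assumptions for illustration, not the project's actual mapping; it also assumes the server is booted and the JSON layout from the parsing sketch above.

```supercollider
(
// Sketch: stream filtered events from an offline file and map muon pT to pitch.
Routine {
    var file, events, filter;

    SynthDef(\blip, { |out = 0, freq = 440, amp = 0.1|
        var sig = SinOsc.ar(freq) * EnvGen.kr(Env.perc(0.01, 0.2), doneAction: 2);
        Out.ar(out, (sig * amp) ! 2);
    }).add;
    s.sync;                                   // wait until the SynthDef is on the server

    file = File("analysis.json", "r");
    events = file.readAllString.parseYAML["events"];
    file.close;

    // Filter: keep only events with at least two muons, to thin out the stream.
    filter = { |event| event["muons"].size >= 2 };

    events.select(filter).do { |event|
        event["muons"].do { |muon|
            var freq = muon["pt"].asFloat.linexp(1, 200, 100, 4000); // pT (GeV) -> Hz
            Synth(\blip, [\freq, freq]);
        };
        0.25.wait;                            // fixed inter-event time; a real stream could use timestamps
    };
}.play;
)
```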
[^1]: SuperCollider is a programming language for sound synthesis and algorithmic composition.