Music usage reporting #283

charlesh3 opened this Issue Aug 29, 2017 · 6 comments



charlesh3 commented Aug 29, 2017

Is there anyone who can give advice on how to prepare music usage reports for ASCAP, BMI, SESAC, and SoundExchange?
All stations need to do this, so I am hoping there is someone with expertise in using LibreTime logs as a starting point to produce the needed reports.





Robbt commented Aug 31, 2017

This is functionality that is currently lacking. I know that it was added after we left. What would be useful is a compilation of the different reporting formats (to the extent that they are publicly documented) so that we can then write code to create the reports.



jerry924 commented Nov 15, 2017

See issue #196; I added some comments there today. I get the music reporting I need from the database tables for track playout and listener stats. I export the data from these two tables for the time range I care about, then load it into Excel to create my reports.

It should be easily doable as a built-in report in LibreTime: use a range hashmap to store listener stats, then iterate through each track in the playout history and map the number of listeners to that track.
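A minimal sketch of that range-map idea in Python. The sample data, the interval layout, and the averaging-with-fallback rule are all assumptions for illustration, not actual LibreTime behavior:

```python
from bisect import bisect_left, bisect_right
from datetime import datetime

# Listener samples as (timestamp, listener_count), e.g. exported from the
# listener-stats table. Must be sorted by timestamp.
samples = [
    (datetime(2018, 11, 26, 12, 0), 40),
    (datetime(2018, 11, 26, 12, 2), 42),
    (datetime(2018, 11, 26, 12, 4), 38),
    (datetime(2018, 11, 26, 12, 6), 45),
]

# Playout history as (track_id, starts, ends), e.g. from track playout.
history = [
    (101, datetime(2018, 11, 26, 12, 0), datetime(2018, 11, 26, 12, 3, 30)),
    (102, datetime(2018, 11, 26, 12, 3, 30), datetime(2018, 11, 26, 12, 7)),
]

def listeners_for_track(starts, ends, samples):
    """Average the listener samples whose timestamp falls inside the
    track's playout interval; fall back to the last sample before the
    track started if none do (e.g. for very short tracks)."""
    times = [t for t, _ in samples]
    lo = bisect_left(times, starts)
    hi = bisect_right(times, ends)
    inside = [count for _, count in samples[lo:hi]]
    if inside:
        return sum(inside) / len(inside)
    return samples[lo - 1][1] if lo > 0 else 0

report = {tid: listeners_for_track(s, e, samples) for tid, s, e in history}
# report -> {101: 41.0, 102: 41.5}
```

Composite tracks and live DJ sets would still need the manual handling described above, since there is no per-song playout row to attach the samples to.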

The only issue I still have is when someone plays a composite track containing many songs (i.e. an MP3 holding an hour of music), or when a DJ live-plays tracks outside of LibreTime. Those still require manual work.



leonardpg commented Aug 28, 2018

Rather than trying to match things up after the fact, I'd like to create a table in the database that simply stores track id, number of listeners, and number of plays. This table would be updated in real time during a designated two-week reporting period. The reporting period could be toggled on and off in LibreTime.

Then a report could be generated by joining the file info and the reporting period information. Piwik could be used for monthly aggregate tuning hours and joined to a .csv from this report after the .csv is downloaded. Once the report is done, the data from the reporting period table can be purged.
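A rough sqlite sketch of that design. The `reporting_period` table, its columns, and the `record_play` helper are invented for illustration; only `cc_files` is a table name the thread actually mentions:

```python
import sqlite3

# Hypothetical schema: a small accumulator table updated in real time
# during the designated reporting period.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE cc_files (id INTEGER PRIMARY KEY, artist TEXT, title TEXT);
    CREATE TABLE reporting_period (
        file_id INTEGER REFERENCES cc_files(id),
        plays INTEGER DEFAULT 0,
        listeners INTEGER DEFAULT 0
    );
""")
db.execute("INSERT INTO cc_files VALUES (1, 'Artist A', 'Song A')")
db.execute("INSERT INTO cc_files VALUES (2, 'Artist B', 'Song B')")

def record_play(file_id, listener_count):
    """Called when a track starts during the reporting period: bump the
    play count and accumulate the listener snapshot for that track."""
    cur = db.execute(
        "UPDATE reporting_period SET plays = plays + 1, "
        "listeners = listeners + ? WHERE file_id = ?",
        (listener_count, file_id))
    if cur.rowcount == 0:  # first play of this track in the period
        db.execute(
            "INSERT INTO reporting_period (file_id, plays, listeners) "
            "VALUES (?, 1, ?)", (file_id, listener_count))

record_play(1, 40)
record_play(1, 44)
record_play(2, 38)

# The report is then a simple join of file metadata with the period totals,
# after which the reporting_period table can be purged.
report = db.execute("""
    SELECT f.artist, f.title, r.plays, r.listeners
    FROM reporting_period AS r JOIN cc_files AS f ON f.id = r.file_id
    ORDER BY f.id
""").fetchall()
# report -> [('Artist A', 'Song A', 2, 84), ('Artist B', 'Song B', 1, 38)]
```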

I spent some time grepping the code and found methods in ApiController.php called pushStreamStatsAction and notifyMediaItemStartPlayAction. My guess is that notifyMediaItemStartPlayAction could be modified to accept an additional argument, say "reporting_period", and if it's set to TRUE, call whatever code polls the streaming server for current listeners, then pass the media id and listener count to an additional method that records these values in the reporting-period table.

This is just from an hour of guessing at the source code. I'm sure I'm missing a whole lot of stuff.



Robbt commented Aug 29, 2018

That sounds like it would work as a quick-and-dirty modification, but I think it might make more sense to modify the code so that it always records the information, which could then be queried to generate reports for any period later.

Another benefit of recording listeners per track would be the ability for people to look back and see which tracks got the most listens, etc.

A couple of decisions would need to be made about how to approximate the numbers; in general, the streaming stats won't reflect people who tuned in for only a portion of a song.

It would also be good to have a way of entering metadata for shows that are compilations of many shorter shows, similar to what you did for WCRS back in the day. Built into the LibreTime UI, there could be a track annotation interface that lets you set up a track timeline designating different artists/songs for different portions of the track, and that also updates the Icecast/Shoutcast stream metadata. This is a stretch goal that would be nice once the reporting mechanism above is implemented.

Also, if we had links to the formats required by the various reporting agencies, we could build a framework that creates the reports directly.



hairmare commented Oct 6, 2018

In Switzerland the main reporting requirement seems to be supplying an ISRC along with more standard fields. Since our playout is not managed by LibreTime in all cases, we ended up grabbing the info from the stream using ACRCloud. The code we are using alongside our ACRCloud subscription is here. It doesn't take listeners into account, since those are measured by separate means in Switzerland.

Spinitron seems to use ACRCloud under the hood as well, and it seems geared more towards the North American market.



Robbt commented Nov 26, 2018

OK, revisiting this. The information that LibreTime stores isn't perfect, and it would be feasible, but annoying, to attempt to rebuild how many listeners each track had.

From what I can tell, notify_media_item_start_playing is a callback from liquidsoap; see:

def notify_media_item_start_playing(self, media_id):
    """ This is a callback from liquidsoap, we use this to notify
    about the currently playing *song*. We get passed a JSON string
    which we handed to liquidsoap in get_liquidsoap_data(). """
    try:
        pass  # body elided in the original excerpt
    except Exception, e:
        return None

And that in turn calls the code above.

Also, every 120 seconds the pypo code gathers listener statistics and passes them via an API call:

def run(self):
    # Wake up every 120 seconds and gather icecast statistics. Note that we
    # are currently querying the server every 2 minutes for the list of
    # mountpoints as well. We could remove this query if we hooked into
    # rabbitmq events, and listened for these changes instead.
    while True:
        try:
            stream_parameters = self.get_stream_parameters()
            stats = self.get_stream_stats(stream_parameters["stream_params"])
            if stats:
                pass  # push stats to the API (elided in the original excerpt)
        except Exception, e:
            self.logger.error('Exception: %s', e)
        time.sleep(120)
    self.logger.info('ListenerStat thread exiting')

So the data that is actually stored in the database is kind of fragmented. We have cc_playout_history, which is simply all of the tracks played, with a start_date and end_date.
We also have cc_listener_count, which uses foreign keys to point to cc_timestamp and mount_name_id and contains listener_count, which it gets from the pushStreamStats API call.
I don't know why someone decided to put the timestamp in a separate cc_timestamp table instead of including the data directly in cc_listener_count; I don't see any benefit.

So yeah, it might make more sense to redesign this to capture the data more cleanly in the first place.

Polling for listeners at the beginning of each track seems to make sense and would give us a baseline listener count for each song. For longer tracks (i.e. long compilations) it probably also makes sense to poll for stats periodically over the course of the track.

If the playout_history entries shared timestamps with listener_count, it would be a lot easier to match them up to determine the number of listeners per track.
Creating a special reporting module could make sense as well, but I'd rather just make LibreTime collect statistics in the most logical way possible.

The date-range hashmap idea from @jerry924 could work, and integrating it into HistoryService.php would make sense, but I think it would require more than just rewriting this SQL query:

$fileSummaryTable = "((
    SELECT COUNT(history.file_id) as played, history.file_id as file_id
    FROM cc_playout_history AS history
    WHERE history.starts >= :starts AND history.starts < :ends
        AND history.file_id IS NOT NULL
    GROUP BY history.file_id
) AS playout
LEFT JOIN cc_files AS file ON (file.id = playout.file_id)) AS summary";
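For illustration, here is one way the per-file summary could be extended to fold listener samples into the report. This is a sketch against stand-in sqlite tables; the cc_timestamp column names and the range-join condition are assumptions, not the actual LibreTime schema:

```python
import sqlite3

# Minimal stand-ins for the tables named in this thread; only the table
# names come from the thread, the columns on cc_timestamp are assumed.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE cc_playout_history (id INTEGER PRIMARY KEY, file_id INTEGER,
                                     starts TEXT, ends TEXT);
    CREATE TABLE cc_timestamp (id INTEGER PRIMARY KEY, timestamp TEXT);
    CREATE TABLE cc_listener_count (timestamp_id INTEGER, listener_count INTEGER);

    INSERT INTO cc_playout_history VALUES
        (1, 7, '2018-11-26 12:00:00', '2018-11-26 12:04:00'),
        (2, 7, '2018-11-26 13:00:00', '2018-11-26 13:04:00');
    INSERT INTO cc_timestamp VALUES
        (1, '2018-11-26 12:01:00'), (2, '2018-11-26 12:03:00'),
        (3, '2018-11-26 13:02:00');
    INSERT INTO cc_listener_count VALUES (1, 40), (2, 44), (3, 36);
""")

# A range join pulls in every listener sample taken while each history
# row was playing, then aggregates per file.
rows = db.execute("""
    SELECT h.file_id,
           COUNT(DISTINCT h.id)   AS played,
           AVG(lc.listener_count) AS avg_listeners
    FROM cc_playout_history AS h
    JOIN cc_timestamp AS ts
      ON ts.timestamp >= h.starts AND ts.timestamp < h.ends
    JOIN cc_listener_count AS lc
      ON lc.timestamp_id = ts.id
    GROUP BY h.file_id
""").fetchall()
# rows -> [(7, 2, 40.0)]
```

A range join like this is heavier than the equality join in the existing query, which is one reason aligning the playout and listener-count timestamps at capture time, as suggested above, would simplify reporting.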

I haven't looked into Piwik/Matomo's ability to digest Icecast logs, but I know it supports them.

Also, Spinitron does have an integration with ACRCloud, but it costs an additional $19 according to this post. They do their pricing on a per-station basis in general; I found one station saying it costs around $50 a month, and I don't know what price they would offer web-only or LPFM stations. I also don't know how much a subscription to ACRCloud itself would cost, but they do provide code you can run on your own server, and this might be worth investigating if the price is low enough: my station doesn't actually upload individual music tracks very often, so something that uses fingerprinting would make reporting easier.

It would still be helpful for us to keep our own records of, say, how many listeners a certain show or a certain file had, so I'm going to look at modifying the code to make that work.

Robbt added the analytics label Jan 19, 2019
