Podcast: Pre-caching - Schedule activity over an extended period
This change spreads the podcast image and 'more info' pre-caching
activity over an extended period of time.

The intention is to avoid overly burdening modestly powered servers.
Some users may have a considerable number of subscribed feeds (in excess
of 100 have been reported), and fetching and parsing that many feeds all
at the same time can cause considerable delays (reportedly tens of
seconds) when browsing the podcast UI.

This will be particularly noticeable when using the podcast menu after
clearing server caches or re-installing LMS. In those circumstances all
feeds need to be pre-cached, and this will be attempted on first use of
the podcast menu.

With this change, each feed is pre-cached at 30-second intervals. Most
feeds only take a few seconds to handle, so that is a rather conservative
interval. But some feeds may contain several hundred items, which take
rather longer to parse. The 30-second interval should allow most feeds
encountered to complete before a new one is started; for example, 120
subscribed feeds would be spread over roughly an hour. The aim is to
gather up the data in a gentle manner.

The scheduling is carried out by firing off a timer for each feed
concerned, at 30-second intervals (Slim::Utils::Timers::setTimer).
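
For illustration, a minimal sketch of this staggering pattern (not the
actual plugin code, which is in the diff below; @feed_urls and
fetch_and_precache are placeholder names, and setTimer is assumed to take
an object, an absolute time and a callback, as it is used in the diff):

use Slim::Utils::Timers;
use Time::HiRes;

my $interval = 30;   # seconds between successive feed fetches
my $count    = 0;

foreach my $url (@feed_urls) {          # @feed_urls: placeholder list of subscribed feeds
	$count++;
	Slim::Utils::Timers::setTimer(
		$url,                                        # per-feed timer 'object'
		Time::HiRes::time() + $count * $interval,    # 30s, 60s, 90s, ... from now
		sub {
			my $url2fetch = shift;                   # the timer object is handed back to the callback
			fetch_and_precache($url2fetch);          # placeholder for the real fetch-and-parse step
		},
	);
}

Each feed gets its own one-shot timer; the offsets simply spread the work
out rather than chaining one fetch after another.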

No explicit expiry time needs to be set for the retrieved feed, as we
will be caching the data we want from it.
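
As a hypothetical illustration of that last point (precacheFeedData is
not part of this diff, and the feed fields used here are assumed): once a
feed has been parsed, the pieces we care about are written into the
plugin's own cache under the same keys the diff pre-seeds with
placeholders, so the raw feed itself needs no expiring cache entry of its
own.

use Slim::Utils::Cache;

sub precache_sketch {
	my ($url, $feed) = @_;               # $feed: parsed feed structure (assumed fields below)
	my $cache = Slim::Utils::Cache->new;

	# keep the artwork and 'more info' details; '1days' is the TTL form used in the diff
	$cache->set('podcast-rss-' . $url, $feed->{image}, '1days');
	$cache->set('podcast_moreInfo_' . $url, { description => $feed->{description} }, '1days');
}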
mw9 committed Sep 26, 2021
1 parent d0b9c8c commit cf0e21e
Showing 1 changed file with 28 additions and 10 deletions.
38 changes: 28 additions & 10 deletions Slim/Plugin/Podcast/Plugin.pm
@@ -19,6 +19,7 @@ use Slim::Utils::Log;
 use Slim::Utils::Prefs;
 use Slim::Utils::Strings qw(string cstring);
 use Slim::Utils::Timers;
+use Time::HiRes;

 use Slim::Plugin::Podcast::ProtocolHandler;

@@ -169,6 +170,9 @@ sub handleFeed {
 	# then existing feeds
 	my @feeds = @{$prefs->get('feeds')};

+	# number of feeds needing precaching - used to set schedule
+	my $fetchCount = 0;
+
 	foreach ( @feeds ) {
 		my $url = $_->{value};
 		my $image = $cache->get('podcast-rss-' . $url);
@@ -191,18 +195,32 @@ sub handleFeed {
 			$cache->set('podcast-rss-' . $url, __PACKAGE__->_pluginDataFor('icon'), '1days');
 			$cache->set('podcast_moreInfo_' . $url, {}, '1days');

-			Slim::Formats::XML->getFeedAsync(
-				sub {
-					precacheFeedData($url, $_[0]);
-				},
+			# retrieve and parse each feed at 30 second intervals to limit loading
+			# on modestly powered servers
+			# some users may have 100/200 podcast subscriptions or more, potentially all
+			# to be handled at the same time
+			main::DEBUGLOG && $log->is_debug && $log->debug("scheduling pre-cache of: ", $url);
+			$fetchCount++;
+			Slim::Utils::Timers::setTimer(
+				$url,
+				Time::HiRes::time() + $fetchCount * 30,
 				sub {
-					$log->warn("can't get $url RSS feed information: ", $_[0]);
+					my $url2fetch = shift;
+					main::DEBUGLOG && $log->is_debug && $log->debug("launching pre-cache of: ", $url2fetch);
+					Slim::Formats::XML->getFeedAsync(
+						sub {
+							precacheFeedData($url2fetch, $_[0]);
+							main::DEBUGLOG && $log->is_debug && $log->debug("completed pre-cache of: ", $url2fetch);
+						},
+						sub {
+							$log->warn("can't get $url2fetch RSS feed information: ", $_[0]);
+						},
+						{
+							parser => 'Slim::Plugin::Podcast::Parser',
+							url => $url2fetch,
+						}
+					);
 				},
-				{
-					parser => 'Slim::Plugin::Podcast::Parser',
-					url => $_->{value},
-					expires => 86400
-				}
 			);
 		}
 	}