
Offline download memory usage #1167

Closed
danmo opened this issue Dec 4, 2017 · 13 comments
Assignees
Labels
status: archived Archived and locked; will not be updated type: bug Something isn't working correctly
Milestone

Comments

@danmo

danmo commented Dec 4, 2017

Hi,

I'm using the latest Shaka Player version with the Offline storage feature to download and play back content.

My issue is that memory consumption grows continuously while downloading; it seems that Shaka is keeping everything in memory. Is there a way to cap memory consumption at a maximum value?

@joeyparrish
Member

@danmo, what browser are you using and how are you checking memory consumption?

I have run Chrome out of memory before by leaving the JavaScript console open. When the network panel in the console is recording traffic, it will keep every media segment in memory (so the developer can scroll back and examine them). It's an easy mistake to make.

@danmo
Author

danmo commented Dec 4, 2017

@joeyparrish I'm using Chrome on Win 7 32bit and using task manager to monitor memory consumption. Dev tools are closed.

@danmo
Author

danmo commented Dec 5, 2017

Hi @joeyparrish, this one is a pretty big blocker for us. Any advice on how it can be resolved?

Reproducing it is easy, for example, you can start downloading a 4k video in the demo app, you'll see the tab memory usage increasing.

Below are some screen grabs of a heap snapshot I did:
(heap snapshot screenshots not reproduced in this transcript)

@joeyparrish
Member

I'm sorry. We do our best, but we can't always triage and fix bugs in only a day or two. We'll get to this as soon as we can.

@michellezhuogg michellezhuogg added type: bug Something isn't working correctly and removed needs triage labels Dec 7, 2017
@joeyparrish joeyparrish self-assigned this Dec 7, 2017
@joeyparrish
Member

Taking a look now

@joeyparrish
Member

Here's what I did:

  1. Open latest release (v2.2.8) and select "Sintel 4k (MP4 only)". The MP4-only version is larger than the WebM version, so this should exacerbate any leak by fetching more data.
  2. Open Chrome dev tools and select the "Memory" tab
  3. Click the "take heap snapshot" button (gray circle in the top-left of the panel)
  4. After the snapshot is done, click the "Store" button in the demo app
  5. During storage, take additional heap snapshots (10% progress, 20%, ... 90%)
  6. Wait for storage to complete
  7. Click the "take heap snapshot" button again

The pre-storage snapshot in my case showed 8.8MB memory in use. The post-storage snapshot showed the same. On disk, the stored content is about 176MB.

The in-progress snapshots show increased memory usage during storage. At 10% progress, this was about 26.3MB. At 20% progress, this was about 42.7MB. At 50% it was about 92.9MB. At 90% it was about 147MB. The bulk of this memory is used by the "JSArrayBufferData" type.

During storage, we seem to be using a lot of memory. After storage, the memory is freed. So it does not appear to be an actual leak, but we could do better about freeing memory as we commit it to the database.
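In pseudocode terms, "freeing memory as we commit it" amounts to holding only one segment's buffer at a time. Here is a minimal sketch of that pattern (not the actual Shaka code; `fetchSegment` and `insertSegment` are hypothetical stand-ins for the real download and storage-engine calls):

```javascript
// Hypothetical sketch: download and commit segments one at a time, letting
// each segment's ArrayBuffer go out of scope as soon as it is stored.
async function storeSegments(segments, fetchSegment, insertSegment) {
  for (const segment of segments) {
    const data = await fetchSegment(segment.uri);  // e.g. an ArrayBuffer
    await insertSegment(segment.key, data);        // commit to IndexedDB
    // No reference to `data` survives this iteration, so the buffer is
    // eligible for garbage collection before the next download begins.
  }
}
```

The key point is that nothing outside the loop body retains a reference to the buffer; peak memory stays near one segment's size instead of the whole asset.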

@joeyparrish joeyparrish added this to the v2.3.0 milestone Dec 7, 2017
@joeyparrish
Member

@vaage, FYI, since you are working on other storage-related issues. I will try to tackle this for v2.2.x if possible.

@joeyparrish
Member

Similar results from the latest nightly build.

@joeyparrish
Member

Although this is not a leak, we are temporarily keeping all segment data in memory during the storage operation, which is pretty bad in its own right.

It will probably take me a few days to publish a fix, because I need to write a test to go along with it. The test is the hard part by far. The storage system is in the middle of a serious refactor, so a test is extra important right now. We must be careful not to regress after fixing it.

In the meantime, you can apply this small change to your own deployment to fix the issue:

In lib/offline/download_manager.js, where you see this:

        this.storedSegments_.push(segment.segmentDb.key);
        segment.segmentDb.data = response.data;
        return this.storageEngine_.insertSegment(segment.segmentDb);
      }.bind(this))
      .then(function() {
        if (!this.manifest_) {
          return Promise.reject(new shaka.util.Error(

Add one line to the Promise chain after insertSegment() is complete:

        this.storedSegments_.push(segment.segmentDb.key);
        segment.segmentDb.data = response.data;
        return this.storageEngine_.insertSegment(segment.segmentDb);
      }.bind(this))
      .then(function() {
        // vvvvvvv
        segment.segmentDb.data = null;   // <--- ADD THIS
        // ^^^^^^^

        if (!this.manifest_) {
          return Promise.reject(new shaka.util.Error(

This will relieve memory pressure during storage, and should unblock your launch. We expect to have a more complete fix out early next week.

@joeyparrish joeyparrish changed the title Offline download memory leaks Offline download memory usage Dec 7, 2017
@joeyparrish
Member

I have been unable to create a regression test for this, so we will publish a fix without one.

Here are the things I explored in my attempts to write an automated test in JavaScript:

  • WeakMap. This doesn't help, because it only makes weak references to keys, not values, and the keys cannot be iterated, counted, or listed. So we could keep weak references to ArrayBuffers in a test so that the test does not prevent garbage collection, but we would have no API to explore that map and see which buffers or how many buffers were still alive.
  • performance.memory. This doesn't help because the actual data buffer in an ArrayBuffer does not seem to count toward the statistics there. After allocating 30MB of ArrayBuffers in a test, the stats do not move.
  • Exploring object hierarchy. This does not help, because DownloadManager does not actually hold the references to the ArrayBuffers directly. Instead, the references are held by an array which is held by closures which are held by DownloadManager. So there would be no way to iterate through private members of Storage and ultimately locate ArrayBuffer references.

The fix should be out today or Monday (pending code review internally), and it will be cherry-picked for the v2.2.9 release next week.

joeyparrish added a commit that referenced this issue Dec 8, 2017
Although DownloadManager clears the segment list as soon as it has
started the download Promise chain, the entire list is bound into
each download function.  This is subtle and difficult to address
directly.

Instead, we set the segment data to null explicitly after the segment
is stored.  In this way, we are sure to only have one segment buffer
in memory at a time during a storage operation.

Note that there is no regression test to go along with this, because
ArrayBuffer memory consumption cannot be directly instrumented from
JavaScript, weak references cannot be used to track which buffers are
garbage collected, and the buffer references are bound into anonymous
functions and therefore cannot be found by exploring object
hierarchies.

Closes #1167

Backported to v2.2.x

Change-Id: Ifeccbfb2d15a1a0243524c92e36512f9308fd5c6
@joeyparrish
Member

The fix has been cherry-picked for v2.2.9 and is available in the nightly build now: https://nightly-dot-shaka-player-demo.appspot.com/demo/

Please let us know if you have any questions.

@danmo
Author

danmo commented Dec 9, 2017

@joeyparrish many thanks, the provided solution has fixed the memory issue!

@joeyparrish
Member

Glad to hear it! Please let us know if you need anything else.

@shaka-project shaka-project locked and limited conversation to collaborators Mar 22, 2018
@shaka-bot shaka-bot added the status: archived Archived and locked; will not be updated label Apr 15, 2021