
Take breaks when loading large amounts of data #818

Merged: subdavis merged 2 commits into main from bugfix/load-crash-big-data on Jul 1, 2021

Conversation

@subdavis (Contributor)

This is a really obvious interim fix for issues where the page locks up during data load, one we should have added months ago.

  • Add a percentage progress bar to show how much of the dataset is loaded.
  • Take 500ms breaks every 4000 track loads to yield CPU time to other scheduled tasks on the page during load (see the sketch below). This prevents the browser from trying to kill the page for "not responding", as can happen with really large amounts of data. For absurdly large amounts of data, it also gives the user much better confidence that something is happening.

I picked the batch and sleep numbers somewhat arbitrarily because the Vuetify progress spinner is kinda buggy and runs its transitions in JS land rather than in CSS. It was implemented such that the transitions stack and can fall behind, but that's a whole other thing.
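
Conceptually, the loading loop does something like the following. This is a minimal sketch of the approach, not the exact PR diff: insertTrack, sleep, and the shape of the progress object are assumed stand-ins for the app's real track-store helper and reactive progress state; the 4000-track batch and 500ms pause come from the PR.

// Minimal sketch of the batching approach; helper names are assumptions.
const progress = { total: 0, progress: -1 }; // progress < 0 renders the indeterminate spinner

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function loadTracks(
  trackData: Record<string, unknown>,
  insertTrack: (track: unknown) => void,
): Promise<void> {
  const tracks = Object.values(trackData);
  progress.total = tracks.length;
  for (let i = 0; i < tracks.length; i += 1) {
    if (i % 4000 === 0) {
      // Every 4000 tracks, publish progress and yield the main thread for
      // 500ms so the browser can repaint and run other scheduled tasks.
      progress.progress = i;
      await sleep(500);
    }
    insertTrack(tracks[i]);
  }
  progress.progress = tracks.length;
}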

subdavis requested a review from BryonLewis on June 28, 2021 at 15:16
@BryonLewis (Collaborator) left a comment

This works really well and is much better than Chrome having a fit when loading large data. I want to get this added in any state; I just have a couple of questions, and their solutions don't need to be implemented now.

Do we want to disable user interaction during the loading process? I can delete stuff while it is in the middle of loading.
Should time taken be used instead of the number of tracks? Corner case: if you have an 8-hour video with 100 tracks that all have polygons, thousands of attributes, and happen to span the full 8 hours, the track file is large but the tracks are not numerous. The percentage loaded will jump straight from 0% to 100% even though loading may take a long time. I realize that updating progress by track count is obviously the easiest thing to do for now.

As an aside, I updated my generateLargeDataset.py to do some of these tests, mostly making it so I can create 1,000,000-frame videos that can be loaded (loading a million images, even locally, is painful). I'll update that script in another PR to provide that additional functionality.

    v-else
-   indeterminate
+   :indeterminate="progress.progress < 0"
+   :value="Math.round(progress.progress / progress.total * 100)"
@BryonLewis (Collaborator)

Can we rotate=-90 this? I find the default position weird.

@subdavis (Contributor, Author)

Sure.
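
For context, the updated element would then presumably read something like this. This is only a sketch: the v-progress-circular tag (which supports the rotate prop) and the surrounding markup are my assumption; the bindings come from the diff above.

<v-progress-circular
  v-else
  :rotate="-90"
  :indeterminate="progress.progress < 0"
  :value="Math.round(progress.progress / progress.total * 100)"
/>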

    class="main-progress-linear"
  >
-   Loading
+   <span v-if="progress.progress < 0">Loading</span>
@BryonLewis (Collaborator)

Feel free to ignore: this line is for some reason hard for my brain to process quickly. I think it's the progress < 0">Loading</span> part. Do you think we could force the formatter to put this on a new line?

@subdavis (Contributor, Author)

Also sure.

const tracks = Object.values(trackData);
progress.total = tracks.length;
for (let i = 0; i < tracks.length; i += 1) {
  if (i % 4000 === 0) {
@BryonLewis (Collaborator)

Do we want to add a && i >= 4000 to this? If not, for small datasets there is a quick transition from "Loading" to 0%, and then everything is loaded.
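
A quick sketch of the suggested guard (the final fix landed differently, per the follow-up below; progress and sleep are the same assumed helpers as in the earlier sketch):

if (i % 4000 === 0 && i >= 4000) {
  // Skip the break at i === 0 so small datasets don't flash from
  // "Loading" to 0% right before finishing.
  progress.progress = i;
  await sleep(500);
}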

@subdavis (Contributor, Author)

Also also sure.

@subdavis (Contributor, Author)

Fixed this in a different way in 15b52cf

@subdavis (Contributor, Author)

> Do we want to disable user interaction during the loading process?

Probably, but this is tied to #727, which is now a priority issue. Once we have a single variable to control the no-user-interaction mode, we can enable it during load.

> I can delete stuff while it is in the middle of loading.

Solved by the above, I think.

> Should time taken be used instead of the number of tracks? Corner case: if you have an 8-hour video with 100 tracks that all have polygons, thousands of attributes, and happen to span the full 8 hours, the track file is large but the tracks are not numerous. The percentage loaded will jump straight from 0% to 100% even though loading may take a long time. I realize that updating progress by track count is obviously the easiest thing to do for now.

This is an interesting idea. You could take a 500ms break for every second of loading. There's another benefit: it scales better with load. If things go quickly at first and then slow down (because you have slow hardware and memory usage is getting close to the limit), the chunks of processed data would get smaller, but you'd have a lower risk of the page becoming unresponsive.

I'm going to try it.
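
Roughly, that time-based variant might look like the following. This is only a sketch of the idea described above (it was ultimately not merged), reusing the assumed sleep helper and progress object from the earlier sketch.

async function loadTracksTimeBased(
  trackData: Record<string, unknown>,
  insertTrack: (track: unknown) => void,
): Promise<void> {
  const tracks = Object.values(trackData);
  progress.total = tracks.length;
  let lastBreak = Date.now();
  for (let i = 0; i < tracks.length; i += 1) {
    insertTrack(tracks[i]);
    // After roughly one second of continuous work, pause for 500ms so the
    // page stays responsive regardless of how fast individual tracks load.
    if (Date.now() - lastBreak >= 1000) {
      progress.progress = i;
      await sleep(500);
      lastBreak = Date.now();
    }
  }
  progress.progress = tracks.length;
}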

@subdavis (Contributor, Author) commented on Jul 1, 2021

I tried it and I didn't like the time-based version. I did a slapdash version and it felt less responsive.

subdavis requested a review from BryonLewis on July 1, 2021 at 13:58
@BryonLewis (Collaborator)

Just out of curiosity: was it the code implementation or the user experience of the time-based version that you didn't like?

subdavis force-pushed the bugfix/load-crash-big-data branch from 15b52cf to 06d5375 on July 1, 2021 at 14:09
@subdavis (Contributor, Author) commented on Jul 1, 2021

It wasn't updating in a predictable way (every second). I'm not sure why.

subdavis merged commit 11429bc into main on Jul 1, 2021.
subdavis deleted the bugfix/load-crash-big-data branch on July 1, 2021 at 15:51.