Dataset: formulate plan of attack #7

Closed
JohnTigue opened this issue Oct 14, 2019 · 3 comments
JohnTigue commented Oct 14, 2019

This work is implemented in a Jupyter notebook, challenge_dataset.ipynb.

  • The dataset is stored in a bucket named brightfield-auto-reconstruction-competition.
  • The total dataset is about 2 TB.
  • Each neuron's data is between ~6 GB and ~60 GB.
  • There is no index file in each neuron's data directory, just:
    • one SWC file (except TEST_DATA_SET)
    • a series of .tif files

Google Colab's allocated file system is about 50 GB, so pulling down one neuron at a time should work.
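That one-neuron-at-a-time plan can be sketched as a download loop that checks the disk budget before fetching. This is a minimal sketch, not code from the notebook: it assumes boto3 with anonymous (unsigned) read access to the bucket, and the `/content/data` destination path is a Colab-convention assumption.

```python
import os

BUCKET = "brightfield-auto-reconstruction-competition"  # bucket named in the issue
DISK_BUDGET_BYTES = 50 * 10**9  # Colab's ~50 GB scratch disk

def fits_on_disk(object_sizes, budget=DISK_BUDGET_BYTES):
    """Return True if one neuron's whole image stack fits in the disk budget."""
    return sum(object_sizes) <= budget

def download_neuron(neuron_id, dest="/content/data"):
    """Download every object under one neuron's prefix, e.g. '651806289/'."""
    # boto3 imported here so the pure helper above has no AWS dependency.
    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    paginator = s3.get_paginator("list_objects_v2")
    objs = [
        o
        for page in paginator.paginate(Bucket=BUCKET, Prefix=f"{neuron_id}/")
        for o in page.get("Contents", [])
    ]
    if not fits_on_disk([o["Size"] for o in objs]):
        raise RuntimeError(f"{neuron_id} will not fit on the Colab disk")
    for o in objs:
        local = os.path.join(dest, o["Key"])
        os.makedirs(os.path.dirname(local), exist_ok=True)
        s3.download_file(BUCKET, o["Key"], local)
```

Since every neuron listed below is well under 50 GB, the `fits_on_disk` check should always pass for this dataset; it is there to fail loudly rather than fill the disk mid-download.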

@JohnTigue JohnTigue self-assigned this Oct 14, 2019
@JohnTigue JohnTigue added this to In progress in Jupyter UI Oct 14, 2019
JohnTigue commented:

Smallest training image stacks: there are 7 image stacks smaller than 10 GB:

  • 651806289/: 291 files = 6.0 GB
  • 647289876/: 228 files = 7.0 GB
  • 651748297/: 336 files = 7.0 GB
  • 647244741/: 261 files = 8.0 GB
  • 713686035/: 289 files = 8.9 GB
  • 647247980/: 299 files = 9.1 GB
  • 649052017/: 307 files = 9.4 GB
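Per-stack counts like the ones above can be derived from a flat listing of the bucket, grouping object keys by their top-level prefix. A minimal sketch of that summarization step (the keys and sizes in the example are made up for illustration; the real listing would come from e.g. an S3 listing call):

```python
from collections import defaultdict

def summarize_stacks(objects):
    """Group (key, size_bytes) pairs by top-level prefix.

    Returns rows of (prefix, file_count, total_gb), smallest stack first.
    """
    stats = defaultdict(lambda: [0, 0])  # prefix -> [file_count, total_bytes]
    for key, size in objects:
        prefix = key.split("/", 1)[0] + "/"
        stats[prefix][0] += 1
        stats[prefix][1] += size
    return sorted(
        ((p, n, b / 1e9) for p, (n, b) in stats.items()),
        key=lambda row: row[2],
    )

# Hypothetical keys/sizes (real stacks have hundreds of .tif files each):
rows = summarize_stacks([
    ("651806289/reconstruction.swc", 500_000_000),
    ("651806289/image_0001.tif", 2_000_000_000),
    ("647289876/image_0001.tif", 1_000_000_000),
])
```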

JohnTigue commented:

Testing neurons mapped: 10, sorted by size of image stack:

  • 665856925/: 281 files = 8.6 GB
  • 715953708/: 340 files = 10.4 GB
  • 751017870/: 465 files = 18.9 GB
  • 687730329/: 497 files = 20.3 GB
  • 850675694/: 438 files = 23.5 GB
  • 827413048/: 424 files = 28.3 GB
  • 761936495/: 529 files = 28.5 GB
  • 691311995/: 441 files = 29.4 GB
  • 741428906/: 591 files = 39.4 GB
  • 878858275/: 541 files = 54.0 GB

@JohnTigue JohnTigue changed the title Dataset: summarize Dataset: formulate plan of attack Oct 15, 2019
JohnTigue commented:

So, use the smallest as the first one to reconstruct: 651806289 (291 files, 6.0 GB).

@JohnTigue JohnTigue moved this from In progress to Done in Jupyter UI Oct 15, 2019