
single process worker #10

Merged
perrygeo merged 5 commits into master from hot-tub on Jun 22, 2016

Conversation

dnomadb (Contributor) commented Jun 17, 2016

@perrygeo what are your thoughts on this approach for #9 ? Prob should be less cheeky in naming. :trollface:

coveralls commented:

Coverage Status

Changes Unknown when pulling 27180fe on hot-tub into master.


coveralls commented Jun 17, 2016

Coverage Status

Changes Unknown when pulling 3369f12 on hot-tub into master.

coveralls commented Jun 17, 2016

Coverage Status

Changes Unknown when pulling 68f44ba on hot-tub into master.

coveralls commented Jun 18, 2016

Coverage Status

Changes Unknown when pulling 085886a on hot-tub into master.

dnomadb (Contributor, Author) commented Jun 18, 2016

@perrygeo I am not sure if this will solve our traceback issues, but I rolled in a good number of test improvements. 👀 please?

dnomadb (Contributor, Author) commented Jun 22, 2016

Tracebacks

With: https://gist.github.com/f4cdc39c4fc5fd07e40513cb39c9b088

With 1 process (i.e. mocked)

» python test-delete.py 2>&1 | wc -l
      26

With multiprocessing

» python test-delete.py 2>&1 | wc -l
      40
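The shorter traceback on the single-process path is the expected behavior: when the work function runs in-process, an exception surfaces with its original stack, whereas `multiprocessing` re-raises it across a process boundary with extra frames. A minimal sketch of a single-process stand-in for `multiprocessing.Pool` (class and method names here are illustrative, not necessarily the PR's actual API):

```python
class SingleProcessPool:
    """Hypothetical single-process stand-in exposing the small subset of
    the multiprocessing.Pool interface a worker loop needs."""

    def __init__(self, initializer=None, initargs=()):
        # Run any worker setup in this process instead of a child.
        if initializer is not None:
            initializer(*initargs)

    def imap_unordered(self, func, iterable):
        # Plain map: same call shape as Pool.imap_unordered, but no
        # child processes, so tracebacks stay short and local.
        return map(func, iterable)

    def close(self):
        pass

    def join(self):
        pass
```

Swapping something like this in when only one process is requested is one way to get both shorter tracebacks and in-process execution that profilers can observe.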

Profiling

With https://gist.github.com/01952c2ead83697b28e6b0c43fafa129

With 1 process (i.e. mocked)

 » python -m memory_profiler test-delete.py
Filename: test-delete.py

Line #    Mem usage    Increment   Line Contents
================================================
     5   84.594 MiB    0.000 MiB   @profile
     6                             def read_function_simple(data, window, ij, g_args):
     7                                 ## do a ton of unnecessary work
     8   84.594 MiB    0.000 MiB       depth, rows, cols = data[0].shape
     9                             
    10   84.594 MiB    0.000 MiB       return np.mean(
    11   84.594 MiB    0.000 MiB           np.dstack(
    12                                         [
    13  128.156 MiB   43.562 MiB                   data[0][0].copy().astype(np.float64) * np.random.rand(rows, cols) for i in range(100)
    14                                         ]),
    15   84.656 MiB  -43.500 MiB           axis=2).astype(data[0].dtype).reshape(depth, rows, cols)

With 2 processes:

» python -m memory_profiler test-delete.py

(nothing)
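The empty report with 2 processes is consistent with how `memory_profiler`'s line-by-line `@profile` works: the decorated function executes inside the worker children, so the parent process has nothing to record. A small demonstration of the difference (function name is illustrative):

```python
import multiprocessing
import os


def where_it_ran(_):
    # Report the pid of the process that actually executed the work.
    return os.getpid()


if __name__ == "__main__":
    parent = os.getpid()

    # Single-process ("mocked") path: work runs in the parent, where a
    # line profiler attached to the parent can see it.
    assert all(pid == parent for pid in map(where_it_ran, range(4)))

    # multiprocessing path: work runs in child processes, outside the
    # parent profiler's view, hence the blank report above.
    pool = multiprocessing.Pool(2)
    child_pids = pool.map(where_it_ran, range(4))
    pool.close()
    pool.join()
    assert all(pid != parent for pid in child_pids)
```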

perrygeo (Contributor) commented:
🚀

@perrygeo perrygeo merged commit 73ad9ab into master Jun 22, 2016
@perrygeo perrygeo deleted the hot-tub branch June 22, 2016 18:58
