Don't use parallel_map() when pool_size is 1
dahlia committed Nov 15, 2014
1 parent 4ca3bb3 commit c9e1c15
Showing 1 changed file with 4 additions and 0 deletions.
4 changes: 4 additions & 0 deletions libearth/crawler.py
```diff
@@ -76,6 +76,10 @@ def crawl(feed_urls, pool_size, timeout=DEFAULT_TIMEOUT):
         func = get_feed
     else:
         func = functools.partial(get_feed, timeout=int(timeout))
+    if pool_size < 1:
+        raise ValueError('pool_size must be greater than zero')
+    elif pool_size == 1:
+        return [func(url) for url in feed_urls]
     return parallel_map(pool_size, func, feed_urls)
```
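The change above can be sketched in isolation: when only one worker is requested, spinning up a pool buys nothing, so a plain list comprehension dispatches the calls sequentially. This is a minimal, self-contained sketch of that guard, not libearth's actual code; the names `dispatch` and `fetch` are illustrative stand-ins for `crawl` and `get_feed`, and `ThreadPool` here stands in for libearth's `parallel_map`.

```python
from multiprocessing.pool import ThreadPool


def dispatch(pool_size, func, urls):
    """Apply func to each url, using a pool only when it pays off."""
    if pool_size < 1:
        raise ValueError('pool_size must be greater than zero')
    elif pool_size == 1:
        # Sequential path: no pool is created at all.
        return [func(url) for url in urls]
    pool = ThreadPool(pool_size)
    try:
        return pool.map(func, urls)
    finally:
        pool.close()
        pool.join()


def fetch(url):
    # Stand-in for get_feed(); just echoes the URL.
    return 'fetched:' + url


dispatch(1, fetch, ['a', 'b'])  # sequential, no pool overhead
dispatch(2, fetch, ['a', 'b'])  # pooled
```

Either path returns results in input order, so callers see identical behavior regardless of `pool_size`; the guard only avoids the cost of creating and tearing down a pool for a single worker.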


