
Use multiple GPUs during predict #5742

Closed
darrylbobo opened this issue Apr 9, 2017 · 2 comments

Comments


darrylbobo commented Apr 9, 2017

I am trying to extract intermediate-layer activation maps with a trained model. However, when I set the context to 2 GPUs, it only ever uses one GPU; the other GPU stays at 0% utilization.

Thread #4968 says I should set kv-store to "device" instead of "local". However, since I am not fitting the model to new data and only call the predict function, where should I change the kv-store?

Here is the code:

import mxnet as mx

# Load the trained model onto both GPUs
model_load = mx.model.FeedForward.load(prefix, 0, ctx=[mx.gpu(0), mx.gpu(1)], numpy_batch_size=1)

# Names of the intermediate outputs of interest
layer_name = ['relu1_2_output', 'relu2_2_output', 'relu3_3_output', 'relu4_3_output',
              'relu5_3_output', 'relu6_output', 'relu7_output']

# Slice the symbol graph at the chosen layer
all_layers = model_load.symbol.get_internals()
fea_symbol = all_layers[layer_name[2]]

# Build a feature extractor that reuses the loaded parameters
feature_extractor = mx.model.FeedForward(ctx=[mx.gpu(0), mx.gpu(1)], symbol=fea_symbol,
                                         numpy_batch_size=1, arg_params=model_load.arg_params,
                                         aux_params=model_load.aux_params,
                                         allow_extra_params=True)

[val_feature, valdata, vallabel] = feature_extractor.predict(img, return_data=True)

piiswrong (Contributor) commented
use module instead
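
For reference, a minimal sketch of what Module-based, multi-GPU feature extraction might look like. The input name 'data', the (32, 3, 224, 224) data shape, and the batch size are assumptions for illustration, not taken from the original code:

import mxnet as mx

# Load symbol and parameters from the checkpoint
sym, arg_params, aux_params = mx.model.load_checkpoint(prefix, 0)

# Slice the graph at the desired internal output, as in the original code
fea_symbol = sym.get_internals()['relu3_3_output']

# A Module spread over both GPUs; each batch is split across the listed contexts,
# so the batch size should be a multiple of the number of GPUs (32 is assumed here)
mod = mx.mod.Module(symbol=fea_symbol, context=[mx.gpu(0), mx.gpu(1)],
                    data_names=['data'], label_names=None)
mod.bind(data_shapes=[('data', (32, 3, 224, 224))], for_training=False)
mod.set_params(arg_params, aux_params, allow_missing=True)

# img is assumed to be a numpy array or NDArray holding the batch of images
data_iter = mx.io.NDArrayIter(data=img, batch_size=32)
features = mod.predict(data_iter)

Note that with data parallelism each batch is divided among the contexts, so a batch size of 1 cannot keep a second GPU busy regardless of the API used.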

szha (Member) commented Sep 29, 2017

This issue is closed due to lack of activity in the last 90 days. Feel free to ping me to reopen if this is still an active issue. Thanks!

szha closed this as completed Sep 29, 2017