
ImportError: No module named '_prroi_pooling' #13

Open
adithya-samavedhi opened this issue Aug 25, 2022 · 1 comment

@adithya-samavedhi

Hi,
I am trying to set up the codebase and run the cells in dissect_classifier_experiment.ipynb, but I am running into the following error: ImportError: No module named '_prroi_pooling'

Please find the Stack Trace below:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
~/dissect-master/netdissect/upsegmodel/prroi_pool/functional.py in <module>
     21         [pjoin(root_dir, 'prroi_pooling_gpu.c'), pjoin(root_dir, 'prroi_pooling_gpu_impl.cu')],
---> 22         verbose=False
     23     )

/datasets/home/12/112/asamavedhi/anaconda3/envs/dissect_env2/lib/python3.6/site-packages/torch/utils/cpp_extension.py in load(name, sources, extra_cflags, extra_cuda_cflags, extra_ldflags, extra_include_paths, build_directory, verbose, with_cuda, is_python_module)
    920         with_cuda,
--> 921         is_python_module)
    922 

/datasets/home/12/112/asamavedhi/anaconda3/envs/dissect_env2/lib/python3.6/site-packages/torch/utils/cpp_extension.py in _jit_compile(name, sources, extra_cflags, extra_cuda_cflags, extra_ldflags, extra_include_paths, build_directory, verbose, with_cuda, is_python_module)
   1119         print('Loading extension module {}...'.format(name))
-> 1120     return _import_module_from_library(name, build_directory, is_python_module)
   1121 

/datasets/home/12/112/asamavedhi/anaconda3/envs/dissect_env2/lib/python3.6/site-packages/torch/utils/cpp_extension.py in _import_module_from_library(module_name, path, is_python_module)
   1440     # https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path
-> 1441     file, path, description = imp.find_module(module_name, [path])
   1442     # Close the .so file after load.

/datasets/home/12/112/asamavedhi/anaconda3/envs/dissect_env2/lib/python3.6/imp.py in find_module(name, path)
    296     else:
--> 297         raise ImportError(_ERR_MSG.format(name), name=name)
    298 

ImportError: No module named '_prroi_pooling'

During handling of the above exception, another exception occurred:

ImportError                               Traceback (most recent call last)
<ipython-input-17-5fd935b128e0> in <module>
      6 classlabels = [r.split(' ')[0][3:] for r in urlopen(synset_url).read().decode('utf-8').split('\n')]
      7 classlabels = dataset.classes
----> 8 segmodel, seglabels, segcatlabels = experiment.setting.load_segmenter(args.seg)
      9 renorm = renormalize.renormalizer(dataset, target='zc')

~/dissect-master/experiment/setting.py in load_segmenter(segmenter_name)
     78     segmodels.append(segmenter.UnifiedParsingSegmenter(segsizes=[256],
     79             all_parts=all_parts,
---> 80             segdiv=('quad' if quad_seg else None)))
     81     if textures:
     82         segmenter.ensure_segmenter_downloaded('datasets/segmodel', 'texture')

~/dissect-master/netdissect/segmenter.py in __init__(self, segsizes, segdiv, all_parts)
    141         ensure_segmenter_downloaded('datasets/segmodel', 'upp')
    142         segmodel = load_unified_parsing_segmentation_model(
--> 143             segarch, segvocab, epoch)
    144         segmodel.cuda()
    145         self.segmodel = segmodel

~/dissect-master/netdissect/segmenter.py in load_unified_parsing_segmentation_model(segmodel_arch, segvocab, epoch)
    581         fc_dim=2048, use_softmax=True,
    582         nr_classes=nr_classes,
--> 583         weights=os.path.join(segmodel_dir, 'decoder_epoch_%d.pth' % epoch))
    584     segmodel = upsegmodel.SegmentationModule(
    585         seg_encoder, seg_decoder, labeldata)

~/dissect-master/netdissect/upsegmodel/models.py in build_decoder(self, nr_classes, arch, fc_dim, weights, use_softmax)
    199                 fc_dim=fc_dim,
    200                 use_softmax=use_softmax,
--> 201                 fpn_dim=512)
    202         else:
    203             raise Exception('Architecture undefined!')

~/dissect-master/netdissect/upsegmodel/models.py in __init__(self, nr_classes, fc_dim, use_softmax, pool_scales, fpn_inplanes, fpn_dim)
    255                  fpn_inplanes=(256,512,1024,2048), fpn_dim=256):
    256         # Lazy import so that compilation isn't needed if not being used.
--> 257         from .prroi_pool import PrRoIPool2D
    258         super(UPerNet, self).__init__()
    259         self.use_softmax = use_softmax

~/dissect-master/netdissect/upsegmodel/prroi_pool/__init__.py in <module>
     10 # Copyright (c) 2017 Megvii Technology Limited.
     11 
---> 12 from .prroi_pool import *
     13 

~/dissect-master/netdissect/upsegmodel/prroi_pool/prroi_pool.py in <module>
     12 import torch.nn as nn
     13 
---> 14 from .functional import prroi_pool2d
     15 
     16 __all__ = ['PrRoIPool2D']

~/dissect-master/netdissect/upsegmodel/prroi_pool/functional.py in <module>
     23     )
     24 except ImportError:
---> 25     raise ImportError('Can not compile Precise RoI Pooling library.')
     26 
     27 __all__ = ['prroi_pool2d']

ImportError: Can not compile Precise RoI Pooling library.

Could you please help resolve it?
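
For reference, the import that fails is the JIT compilation step in netdissect/upsegmodel/prroi_pool/functional.py shown in the traceback: torch.utils.cpp_extension.load builds the extension from prroi_pooling_gpu.c and prroi_pooling_gpu_impl.cu at import time, and the bare "Can not compile Precise RoI Pooling library" error hides whatever nvcc/gcc actually reported. A minimal sketch of re-running that build by hand with verbose=True follows; the source file names come from the traceback, while the exact root_dir (including the src subdirectory) is an assumption and may need adjusting to your checkout:

```python
# Hypothetical diagnostic, not part of the repo: re-run the JIT build that
# functional.py attempts, with verbose=True so the real compiler error is printed.
import os
from torch.utils.cpp_extension import load

# Assumed location of the Precise RoI Pooling sources; adjust to your checkout.
root_dir = os.path.expanduser(
    '~/dissect-master/netdissect/upsegmodel/prroi_pool/src')

_prroi_pooling = load(
    name='_prroi_pooling',
    sources=[
        os.path.join(root_dir, 'prroi_pooling_gpu.c'),
        os.path.join(root_dir, 'prroi_pooling_gpu_impl.cu'),
    ],
    verbose=True,  # show nvcc/gcc output instead of a bare ImportError
)
```

Running this in the same environment should reveal whether nvcc is missing, CUDA_HOME is unset, or the compiler and toolkit versions conflict.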

@adithya-samavedhi
Author

Here is the output of `conda list`:

(dissect_env2) asamavedhi@dsmlp-jupyter-asamavedhi:~/dissect-master$ conda list

packages in environment at /datasets/home/12/112/asamavedhi/anaconda3/envs/dissect_env2:

Name Version Build Channel

_libgcc_mutex 0.1 main
_openmp_mutex 5.1 1_gnu
_sysroot_linux-64_curr_repodata_hack 3 h5bd9786_13 conda-forge
argon2-cffi 20.1.0 py36h27cfd23_1
async_generator 1.10 py36h28b3542_0
attrs 21.4.0 pyhd3eb1b0_0
backcall 0.2.0 pyhd3eb1b0_0
blas 1.0 mkl
bleach 4.1.0 pyhd3eb1b0_0
ca-certificates 2022.6.15 ha878542_0 conda-forge
certifi 2021.5.30 py36h06a4308_0
cffi 1.14.6 py36h400218f_0
cloudpickle 2.0.0 pyhd3eb1b0_0
cudatoolkit 10.2.89 hfd86e86_1
cudatoolkit-dev 11.4.0 h5e8e339_5 conda-forge
cudnn 7.6.5 cuda10.2_0
cycler 0.11.0 pyhd3eb1b0_0
cytoolz 0.11.0 py36h7b6447c_0
dask-core 2021.3.0 pyhd3eb1b0_0
dbus 1.13.18 hb2f20db_0
decorator 5.1.1 pyhd3eb1b0_0
defusedxml 0.7.1 pyhd3eb1b0_0
entrypoints 0.3 py36_0
expat 2.4.4 h295c915_0
fontconfig 2.13.1 h6c09931_0
freetype 2.11.0 h70c0345_0
glib 2.69.1 h4ff587b_1
gst-plugins-base 1.14.0 h8213a91_2
gstreamer 1.14.0 h28cd5cc_2
icu 58.2 he6710b0_3
imageio 2.9.0 pyhd3eb1b0_0
intel-openmp 2022.0.1 h06a4308_3633
ipykernel 5.3.4 py36h5ca1d4c_0
ipython 7.16.1 py36h5ca1d4c_0
ipython_genutils 0.2.0 pyhd3eb1b0_1
jedi 0.17.0 py36_0
jinja2 3.0.3 pyhd3eb1b0_0
jpeg 9e h7f8727e_0
jsonschema 3.0.2 py36_0
jupyter_client 7.1.2 pyhd3eb1b0_0
jupyter_core 4.8.1 py36h06a4308_0
jupyterlab_pygments 0.1.2 py_0
kernel-headers_linux-64 3.10.0 h4a8ded7_13 conda-forge
kiwisolver 1.3.1 py36h2531618_0
lcms2 2.12 h3be6417_0
ld_impl_linux-64 2.38 h1181459_1
lerc 3.0 h295c915_0
libdeflate 1.8 h7f8727e_5
libffi 3.3 he6710b0_2
libgcc-ng 11.2.0 h1234567_1
libgfortran-ng 7.5.0 ha8ba4b0_17
libgfortran4 7.5.0 ha8ba4b0_17
libgomp 11.2.0 h1234567_1
libpng 1.6.37 hbc83047_0
libsodium 1.0.18 h7b6447c_0
libstdcxx-ng 11.2.0 h1234567_1
libtiff 4.4.0 hecacb30_0
libuuid 1.0.3 h7f8727e_2
libwebp-base 1.2.2 h7f8727e_0
libxcb 1.15 h7f8727e_0
libxml2 2.9.14 h74e7548_0
lz4-c 1.9.3 h295c915_1
markupsafe 2.0.1 py36h27cfd23_0
matplotlib 3.3.4 py36h06a4308_0
matplotlib-base 3.3.4 py36h62a2d02_0
mistune 0.8.4 py36h7b6447c_0
mkl 2020.2 256
mkl-include 2022.0.1 h06a4308_117
mkl-service 2.3.0 py36he8ac12f_0
mkl_fft 1.3.0 py36h54f3939_0
mkl_random 1.1.1 py36h0573a6f_0
nb_conda_kernels 2.3.1 py36h06a4308_0
nbclient 0.5.3 pyhd3eb1b0_0
nbconvert 6.0.7 py36_0
nbformat 5.1.3 pyhd3eb1b0_0
ncurses 6.3 h5eee18b_3
nest-asyncio 1.5.1 pyhd3eb1b0_0
networkx 2.5 py_0
ninja 1.10.2.3 pypi_0 pypi
ninja-base 1.10.2 hd09550d_5
notebook 6.4.3 py36h06a4308_0
numpy 1.19.2 py36h54aff64_0
numpy-base 1.19.2 py36hfa32c7d_0
nvcc_linux-64 11.7 h0fb96c7_21 conda-forge
olefile 0.46 py36_0
openjpeg 2.4.0 h3ad879b_0
openssl 1.1.1l h7f98852_0 conda-forge
packaging 21.3 pyhd3eb1b0_0
pandas 1.1.5 py36ha9443f7_0
pandoc 2.12 h06a4308_0
pandocfilters 1.5.0 pyhd3eb1b0_0
parso 0.8.3 pyhd3eb1b0_0
patsy 0.5.1 py36_0
pcre 8.45 h295c915_0
pexpect 4.8.0 pyhd3eb1b0_3
pickleshare 0.7.5 pyhd3eb1b0_1003
pillow 8.3.1 py36h2c7a002_0
pip 21.2.2 py36h06a4308_0
prometheus_client 0.13.1 pyhd3eb1b0_0
prompt-toolkit 3.0.20 pyhd3eb1b0_0
ptyprocess 0.7.0 pyhd3eb1b0_2
pycparser 2.21 pyhd3eb1b0_0
pygments 2.11.2 pyhd3eb1b0_0
pyparsing 3.0.4 pyhd3eb1b0_0
pyqt 5.9.2 py36h05f1152_2
pyrsistent 0.17.3 py36h7b6447c_0
python 3.6.13 h12debd9_1
python-dateutil 2.8.2 pyhd3eb1b0_0
pytorch 1.5.1 py3.6_cuda10.2.89_cudnn7.6.5_0 pytorch
pytz 2021.3 pyhd3eb1b0_0
pywavelets 1.1.1 py36h7b6447c_2
pyyaml 5.4.1 py36h27cfd23_1
pyzmq 22.2.1 py36h295c915_1
qt 5.9.7 h5867ecd_1
readline 8.1.2 h7f8727e_1
scikit-image 0.17.2 py36hdf5156a_0
scipy 1.5.2 py36h0b6359f_0
sed 4.8 he412f7d_0 conda-forge
send2trash 1.8.0 pyhd3eb1b0_1
setuptools 58.0.4 py36h06a4308_0
sip 4.19.8 py36hf484d3e_0
six 1.16.0 pyhd3eb1b0_1
sqlite 3.39.2 h5082296_0
statsmodels 0.12.2 py36h27cfd23_0
sysroot_linux-64 2.17 h4a8ded7_13 conda-forge
terminado 0.9.4 py36h06a4308_0
testpath 0.5.0 pyhd3eb1b0_0
tifffile 2020.10.1 py36hdd07704_2
tk 8.6.12 h1ccaba5_0
toolz 0.11.2 pyhd3eb1b0_0
torchvision 0.6.1 py36_cu102 pytorch
tornado 6.1 py36h27cfd23_0
traitlets 4.3.3 py36h06a4308_0
wcwidth 0.2.5 pyhd3eb1b0_0
webencodings 0.5.1 py36_1
wheel 0.37.1 pyhd3eb1b0_0
xz 5.2.5 h7f8727e_1
yaml 0.2.5 h7b6447c_0
zeromq 4.3.4 h2531618_0
zlib 1.2.12 h7f8727e_2
zstd 1.5.2 ha4553b6_0
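
One detail in this listing that may be worth checking (a hedged guess, not a confirmed diagnosis): pytorch 1.5.1 here is the py3.6_cuda10.2.89 build with cudatoolkit 10.2.89, while cudatoolkit-dev 11.4.0 and nvcc_linux-64 11.7 are also installed from conda-forge, so the nvcc that torch.utils.cpp_extension picks up may not match the toolkit PyTorch was built against. A small sketch for inspecting which toolchain the JIT build would use (standard PyTorch and shell introspection only):

```python
# Sketch: report the CUDA toolchain visible to torch.utils.cpp_extension.
import subprocess
import torch
from torch.utils.cpp_extension import CUDA_HOME

print('torch version:         ', torch.__version__)
print('torch built with CUDA: ', torch.version.cuda)   # expected 10.2 in this env
print('CUDA_HOME seen by JIT: ', CUDA_HOME)            # toolkit whose nvcc gets used
result = subprocess.run(['nvcc', '--version'], stdout=subprocess.PIPE)
print(result.stdout.decode())                          # nvcc actually on PATH
```

If CUDA_HOME (or the nvcc on PATH) points at an 11.x toolkit while torch.version.cuda reports 10.2, pointing CUDA_HOME at the 10.2 toolkit before launching the notebook would be a reasonable first thing to try.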
