[Python] Segfault when reading parquet files if pytorch is imported before pyarrow #2637
A self-contained example is available here: https://gitlab.com/ostrokach/pyarrow_pytorch_segfault.
```
python -X faulthandler -c "import torch; import pyarrow.parquet as pq; _ = pq.ParquetFile('example.parquet').read_row_group(0)"
```

sooner or later results in a segfault:

```
Fatal Python error: Segmentation fault

Current thread 0x00007f52959bb740 (most recent call first):
  File "/home/kimlab1/strokach/anaconda/lib/python3.6/site-packages/pyarrow/parquet.py", line 125 in read_row_group
  File "<string>", line 1 in <module>
./test_fail.sh: line 5: 42612 Segmentation fault      (core dumped) python -X faulthandler -c "import torch; import pyarrow.parquet as pq; _ = pq.ParquetFile('example.parquet').read_row_group(0)"
```
The number of iterations before a segfault varies, but it usually happens within the first several calls.
```
python -X faulthandler -c "import pyarrow.parquet as pq; import torch; _ = pq.ParquetFile('example.parquet').read_row_group(0)"
```

works without a problem.
The backtrace looks like the same problem we see with the tensorflow wheels: both bundle a newer libstdc++, and when pyarrow is imported last, the virtual shared pointer destructor is resolved from torch's copy of libstdc++.
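One way to check whether two competing C++ runtimes are actually loaded is to inspect the process's memory map after both imports. This is a Linux-only sketch; the helper function is my own illustration, not part of either library:

```python
def mapped_libs(pattern):
    """Return the distinct shared objects currently mapped into this
    process whose path contains `pattern` (reads /proc/self/maps)."""
    with open("/proc/self/maps") as maps:
        return sorted({line.split()[-1] for line in maps
                       if ".so" in line and pattern in line})

# After `import torch; import pyarrow`, seeing two different entries
# here would confirm that two copies of the C++ runtime are in play:
print(mapped_libs("libstdc++"))
```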
Someone should take a look at the symbols exported by the torch wheel; it might also help if they employed the same symbol hiding as we do in pyarrow.
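For reference, the kind of symbol hiding meant here can be approximated with a GNU ld version script that exports only the Python extension entry points. This is a generic sketch, not Arrow's or torch's actual build configuration:

```
/* hide.map: export only the Python extension module entry points and
   keep everything else (including bundled libstdc++ symbols) local */
{
  global:
    PyInit_*;
  local:
    *;
};
```

Such a script would be passed at link time via `-Wl,--version-script=hide.map`, so the wheel's shared library no longer exports C++ runtime symbols that can shadow another wheel's copy.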