Using Intel Arc A770, run prompt, get munmap_chunk(): invalid pointer #13396
stephepush asked this question in Q&A
I'm not sure if I should post here or in the issues thread for the OpenVINO fork, but I'm having an issue when trying to generate an image: it fails with `munmap_chunk(): invalid pointer`.
Here's the readout of what I'm getting:
```
Using TCMalloc: libtcmalloc_minimal.so.4
fatal: No names found, cannot describe anything.
Python 3.10.12 (main, Jun 11 2023, 05:26:28) [GCC 11.4.0]
Version: 1.5.1
Commit hash: c371db5
Installing torch and torchvision
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu118
Collecting torch==2.0.1
Using cached https://download.pytorch.org/whl/cu118/torch-2.0.1%2Bcu118-cp310-cp310-linux_x86_64.whl (2267.3 MB)
Collecting torchvision==0.15.2
Using cached https://download.pytorch.org/whl/cu118/torchvision-0.15.2%2Bcu118-cp310-cp310-linux_x86_64.whl (6.1 MB)
Collecting typing-extensions
Using cached typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Collecting filelock
Using cached filelock-3.12.4-py3-none-any.whl (11 kB)
Collecting sympy
Using cached https://download.pytorch.org/whl/sympy-1.12-py3-none-any.whl (5.7 MB)
Collecting networkx
Using cached networkx-3.1-py3-none-any.whl (2.1 MB)
Collecting triton==2.0.0
Using cached https://download.pytorch.org/whl/triton-2.0.0-1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (63.3 MB)
Collecting jinja2
Using cached https://download.pytorch.org/whl/Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting pillow!=8.3.*,>=5.3.0
Using cached Pillow-10.0.1-cp310-cp310-manylinux_2_28_x86_64.whl (3.6 MB)
Collecting numpy
Using cached numpy-1.26.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)
Collecting requests
Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Collecting lit
Using cached lit-17.0.1-py3-none-any.whl
Collecting cmake
Using cached cmake-3.27.5-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (26.1 MB)
Collecting MarkupSafe>=2.0
Using cached MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Collecting urllib3<3,>=1.21.1
Using cached urllib3-2.0.5-py3-none-any.whl (123 kB)
Collecting idna<4,>=2.5
Using cached https://download.pytorch.org/whl/idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
Using cached charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (201 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2023.7.22-py3-none-any.whl (158 kB)
Collecting mpmath>=0.19
Using cached https://download.pytorch.org/whl/mpmath-1.3.0-py3-none-any.whl (536 kB)
Installing collected packages: mpmath, lit, cmake, urllib3, typing-extensions, sympy, pillow, numpy, networkx, MarkupSafe, idna, filelock, charset-normalizer, certifi, requests, jinja2, triton, torch, torchvision
Successfully installed MarkupSafe-2.1.3 certifi-2023.7.22 charset-normalizer-3.2.0 cmake-3.27.5 filelock-3.12.4 idna-3.4 jinja2-3.1.2 lit-17.0.1 mpmath-1.3.0 networkx-3.1 numpy-1.26.0 pillow-10.0.1 requests-2.31.0 sympy-1.12 torch-2.0.1+cu118 torchvision-0.15.2+cu118 triton-2.0.0 typing-extensions-4.8.0 urllib3-2.0.5
Installing gfpgan
Installing clip
Installing open_clip
Installing requirements for CodeFormer
Installing requirements
Launching Web UI with arguments: --skip-torch-cuda-test --precision full --no-half
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
Warning: caught exception 'Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx', memory monitor disabled
Loading weights [6ce0161689] from /home/cbytes/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
Running on local URL: http://127.0.0.1:7861
To create a public link, set `share=True` in `launch()`.
Startup time: 103.1s (launcher: 99.0s, import torch: 1.2s, import gradio: 0.3s, setup paths: 0.3s, other imports: 0.3s, load scripts: 1.0s, create ui: 0.7s, gradio launch: 0.2s).
Creating model from config: /home/cbytes/stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Applying attention optimization: InvokeAI... done.
Model loaded in 17.1s (load weights from disk: 1.0s, create model: 0.2s, apply weights to model: 15.7s).
Loading weights [6ce0161689] from /home/cbytes/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
OpenVINO Script: created model from config : /home/cbytes/stable-diffusion-webui/configs/v1-inference.yaml
You have disabled the safety checker for <class 'diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion.StableDiffusionPipeline'> by passing `safety_checker=None`. Ensure that you abide to the conditions of the Stable Diffusion license and do not expose unfiltered results in services or applications open to the public. Both the diffusers team and Hugging Face strongly recommend to keep the safety filter enabled in all public facing circumstances, disabling it only for use-cases that involve analyzing network behavior or auditing its results. For more information, please have a look at huggingface/diffusers#254 .
  0%|          | 0/20 [00:00<?, ?it/s]munmap_chunk(): invalid pointer
```
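For what it's worth, the long safety-checker paragraph near the end looks like the standard notice diffusers prints whenever a pipeline is built with `safety_checker=None`, so I don't think it's related to the crash itself; the actual failure is the `munmap_chunk(): invalid pointer` right at 0/20 sampling steps. As an illustration only (not necessarily the exact code path the OpenVINO script uses), a call like this produces the same notice:

```python
# Illustration only: the standard diffusers call pattern that emits the
# safety-checker notice seen at the end of the log above. The model id is
# just an example, not my local v1-5-pruned-emaonly.safetensors file.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    safety_checker=None,  # disabling the checker is what triggers the warning
)
```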
If you need any more info from me, I'll share it, though I'll wait for the request in case whoever answers tells me to ask on the OpenVINO toolkit page :)
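In case it helps, here's a quick check I can run from the webui's venv to confirm the OpenVINO runtime actually sees the Arc A770 (a minimal sketch, assuming the `openvino` package that the fork's requirements install):

```python
# Sanity check: list the devices the OpenVINO runtime can enumerate.
# On a working Arc setup this should include a 'GPU' entry.
from openvino.runtime import Core

core = Core()
print("Available devices:", core.available_devices)  # e.g. ['CPU', 'GPU']
for device in core.available_devices:
    # FULL_DEVICE_NAME is a standard OpenVINO device property
    print(device, "->", core.get_property(device, "FULL_DEVICE_NAME"))
```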