
Voltage in PowerTable not being respected - Vega 64 FF - ROCm 3.0 #1015

Closed
theRTLmaker opened this issue Feb 20, 2020 · 2 comments

Comments

@theRTLmaker

I'm trying to characterize how GPU power and performance change in undervolting scenarios, but when I change the voltage in the PowerTables and select the desired performance level for the GPU core, the voltage stays the same as with the default settings.

I'm running CentOS 7.5 with ROCm 3.0.

I've followed the installation instructions from git/ROCm and the linked issue (with the necessary changes for CentOS instead of Ubuntu), and configured the PowerTables according to the linked issue.

I've written a script that performs this procedure:

  1. scl enable devtoolset-7 bash
  2. rocm-smi -r
  3. rocm-smi --setfan 255
  4. rocm-smi --setperflevel manual
  5. rocm-smi --setslevel with the following values
    Level | Freq (MHz) | Volt (mV)
    0     | 852        | 810
    1     | 991        | 822
    2     | 1138       | 834
    3     | 1269       | 846
    4     | 1348       | 858
    5     | 1440       | 860
    6     | 1528       | 872
    7     | 1600       | 1000
  6. rocm-smi --setsclk 7
  7. rocm-smi --showvoltage
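The steps above can be sketched as a script. This is a dry-run sketch only: it echoes the commands by default (set DRY_RUN=0 on a machine with ROCm installed), the `--setslevel` argument order (level, MHz, mV) is an assumption based on this report, and the interactive step 1 (`scl enable devtoolset-7 bash`) is omitted since it spawns a subshell rather than configuring the GPU.

```shell
#!/bin/sh
# Dry-run sketch of the reported procedure. With DRY_RUN=1 (the default),
# each rocm-smi invocation is printed instead of executed.
DRY_RUN=${DRY_RUN:-1}

run() {
    if [ "$DRY_RUN" -eq 1 ]; then echo "$@"; else "$@"; fi
}

run rocm-smi -r                      # reset clocks to default
run rocm-smi --setfan 255            # fan to maximum
run rocm-smi --setperflevel manual   # enable manual DPM control

# Level / frequency (MHz) / voltage (mV) triples from the table above.
# Argument order for --setslevel is an assumption.
while read -r level freq volt; do
    run rocm-smi --setslevel "$level" "$freq" "$volt"
done <<EOF
0 852 810
1 991 822
2 1138 834
3 1269 846
4 1348 858
5 1440 860
6 1528 872
7 1600 1000
EOF

run rocm-smi --setsclk 7             # pin the core to level 7
run rocm-smi --showvoltage
```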

What I see is that, independently of the voltage I select for level 7, the output of --showvoltage is always:

========================ROCm System Management Interface========================
================================================================================
GPU[1]          : Voltage (mV): 1187
================================================================================
==============================End of ROCm SMI Log ==============================

What can I be doing wrong?

@nikAizuddin

@TheEmbbededCoder Hi, did you solve the problem?

@theRTLmaker
Author

> @TheEmbbededCoder Hi, did you solve the problem?

I just noticed that if I use the default frequency (in this case 1600 MHz for level 7), the voltage does not respect what I wrote for that level; the GPU uses the default 1200 mV. However, if I set any other frequency (for example 1601 MHz or 1599 MHz), the GPU starts to respect the voltage I write for that level.
I don't understand this behaviour; I can't find it documented anywhere.

Do you have any idea why this is the case?
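Based on the observation above, a defensive wrapper could nudge the frequency by 1 MHz whenever it matches the stock value, so the voltage write is not silently ignored. This is a hypothetical workaround sketch, not a documented fix: `set_level` is a helper invented here, the `--setslevel` argument order is an assumption, and the dry-run mode (DRY_RUN=1, the default) prints the command instead of running rocm-smi.

```shell
#!/bin/sh
# Hypothetical workaround: the voltage appears to be ignored only when the
# requested frequency equals the stock frequency, so offset it by 1 MHz.
DRY_RUN=${DRY_RUN:-1}

set_level() {
    level=$1; freq=$2; volt=$3; stock_freq=$4
    if [ "$freq" -eq "$stock_freq" ]; then
        # Avoid the "same as stock" path that reportedly drops the voltage.
        freq=$((freq + 1))
    fi
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "rocm-smi --setslevel $level $freq $volt"
    else
        rocm-smi --setslevel "$level" "$freq" "$volt"
    fi
}

# Level 7: stock frequency 1600 MHz, requested undervolt to 1000 mV.
# The nudge turns 1600 into 1601 so the 1000 mV setting takes effect.
set_level 7 1600 1000 1600
```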
