I'm trying to characterize how GPU power and performance change under undervolting, but when I change the voltage in the PowerTables and select the desired performance level for the GPU core, the reported voltage is the same as with the default settings.
I'm running CentOS 7.5 with ROCm 3.0.
I've followed the installation instructions from git/ROCm and the linked issue (with the necessary changes for CentOS rather than Ubuntu), and configured the PowerTables according to that issue.
I've written a script that performs this procedure:
scl enable devtoolset-7 bash
rocm-smi -r
rocm-smi --setfan 255
rocm-smi --setperflevel manual
rocm-smi --setslevel, once per level, with the following values:
Level | Freq (MHz) | Volt (mV)
0     | 852        | 810
1     | 991        | 822
2     | 1138       | 834
3     | 1269       | 846
4     | 1348       | 858
5     | 1440       | 860
6     | 1528       | 872
7     | 1600       | 1000
rocm-smi --setsclk 7
rocm-smi --showvoltage
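For reference, here is a consolidated sketch of the script (assuming a single GPU, rocm-smi on the PATH, and that --setslevel takes its arguments in the order level, frequency, voltage; adjust for your setup):

```bash
#!/bin/bash
# Sketch of the procedure above. The scl step ("scl enable devtoolset-7 bash")
# is run beforehand in the interactive shell; everything else is plain rocm-smi.
# Assumed --setslevel argument order: <level> <sclk MHz> <voltage mV>.

rocm-smi -r                     # reset clocks/voltages to defaults
rocm-smi --setfan 255           # fan to maximum
rocm-smi --setperflevel manual  # allow manual level selection

# Frequency/voltage pairs from the table above
levels=(
  "0  852  810"
  "1  991  822"
  "2 1138  834"
  "3 1269  846"
  "4 1348  858"
  "5 1440  860"
  "6 1528  872"
  "7 1600 1000"
)
for entry in "${levels[@]}"; do
  rocm-smi --setslevel $entry   # intentionally unquoted: split into three arguments
done

rocm-smi --setsclk 7            # pin the core clock to level 7
rocm-smi --showvoltage          # check the voltage actually reported
```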
What I see is that, regardless of the voltage I select for level 7, the output of --showvoltage is always:
========================ROCm System Management Interface========================
================================================================================
GPU[1] : Voltage (mV): 1187
================================================================================
==============================End of ROCm SMI Log ==============================
What could I be doing wrong?
I just noticed that if I use the default frequency (in this case 1600 MHz for level 7), the voltage does not respect what I wrote for that level and the GPU uses the default 1200 mV. However, if I set any other frequency (for example 1601 MHz or 1599 MHz), the voltage I write for that level is respected by the GPU.
I don't understand this behaviour, and I don't see it documented anywhere...
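One way to cross-check what the driver actually accepted, independent of rocm-smi (a sketch assuming the GPU of interest is card0 and that amdgpu exposes the usual OverDrive sysfs files; the card index and the presence of pp_od_clk_voltage depend on the setup):

```bash
#!/bin/bash
# Sketch: read the sclk DPM levels and the OverDrive table straight from sysfs.
# Assumes the GPU of interest is card0; adjust the index if needed.
DEV=/sys/class/drm/card0/device

cat "$DEV/pp_dpm_sclk"         # sclk levels; the active one is marked with '*'
cat "$DEV/pp_od_clk_voltage"   # frequency/voltage table held by the driver (if OverDrive is enabled)
```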